* [PATCH v1 00/42] BaseTools: refactoring patches
@ 2018-04-27 22:32 Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 01/42] BaseTools: FdfParser - update to remove duplicate constant value Jaben Carsey
` (42 more replies)
0 siblings, 43 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel
The first goal of this series is to reduce meaningless memory allocation and
use. One example is creating a list from an iterator for the sole purpose of
passing it to another function that would accept the iterator directly.
Another example is building a list just to construct a set (sketched below).
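(Editorial sketch, not from the series, using hypothetical data.)
    items = {"a": 1, "b": 2, "c": 3}

    total = sum(list(items.values()))            # wasteful: builds a list only to pass it on
    total = sum(items.values())                  # the function accepts the iterable directly

    unique = set([k.upper() for k in items])     # wasteful: list built just to make a set
    unique = set(k.upper() for k in items)       # a generator (or set comprehension) is enough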
The second goal is the beginning of organizational changes. This includes
moving a function from one class to another when it operates primarily on the
second class's data, and, when a class has a small helper called only from
__init__, folding that logic directly into __init__ (sketched below).
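(Editorial sketch, not from the series: folding a helper only called from __init__ into __init__ itself, with hypothetical names.)
    class Before(object):
        def __init__(self, path):
            self.Path = path
            self._ComputeName()                  # sole caller of the helper

        def _ComputeName(self):
            self.Name = self.Path.split("/")[-1]

    class After(object):
        def __init__(self, path):
            self.Path = path
            self.Name = path.split("/")[-1]      # helper logic folded directly into __init__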
Important note: one patch removes a large amount of trailing whitespace
without making any other changes.
Jaben Carsey (42):
BaseTools: FdfParser - update to remove duplicate constant value
BaseTools: AutoGen - update to remove duplicate constant value
BaseTools: check before accessing members in __eq__
BaseTools: this function has no purpose.
BaseTools: AutoGen - refactor assemble_variable
BaseTools: AutoGen - refactor dictionary access
BaseTools: AutoGen - GenVar refactor static methods
BaseTools: AutoGen - share StripComments API
BaseTools: AutoGen - refactor class factory
BaseTools: Eot - remove unused lists
BaseTools: Eot - refactor global data
BaseTools: AutoGen - remove global line
BaseTools: AutoGen - UniClassObject refactor static methods
BaseTools: refactor to use list not dict
BaseTools: eliminate {} from dictionary constructor call
BaseTools: remove Compound statements
BaseTools: Workspace - refactor a dict
BaseTools: move PCD size calculation functions to PcdClassObject
BaseTools: AutoGen - refactor out functions only called in __init__
BaseTools: AutoGen - refactor out a list
BaseTools: AutoGen - refactor out a useless class
BaseTools: AutoGen - no need to recompute
BaseTools: refactor __init__ functions to not compute temporary
variable
BaseTools: AutoGen - remove function no one calls
BaseTools: AutoGen - move function to clean file namespace
BaseTools: AutoGen - remove another function no one calls
BaseTools: Refactor to share GUID packing function
BaseTools: AutoGen - refactor function to remove extra variables
BaseTools: AutoGen - refactor more functions only called in __init__
BaseTools: remove unused member variable
BaseTools: remove redundant content in InfSectionParser
BaseTools: trim whitespace
BaseTools: AutoGen - add Opcode constants
BaseTools: standardize GUID and pack size
BaseTools: remove unused variable
BaseTools: GenFds - use existing shared string
BaseTools: missed a copyright update
BaseTools: Remove lists from set construction
BaseTools: refactor Depex optimization
BaseTools: don't make iterator into list if not needed
BaseTools: create base expression class
BaseTools: use set instead of list
BaseTools/Source/Python/AutoGen/AutoGen.py | 200 +--
BaseTools/Source/Python/AutoGen/BuildEngine.py | 25 +-
BaseTools/Source/Python/AutoGen/GenC.py | 111 +-
BaseTools/Source/Python/AutoGen/GenDepex.py | 127 +-
BaseTools/Source/Python/AutoGen/GenPcdDb.py | 333 ++---
BaseTools/Source/Python/AutoGen/GenVar.py | 124 +-
BaseTools/Source/Python/AutoGen/IdfClassObject.py | 113 +-
BaseTools/Source/Python/AutoGen/InfSectionParser.py | 21 +-
BaseTools/Source/Python/AutoGen/StrGather.py | 26 +-
BaseTools/Source/Python/AutoGen/UniClassObject.py | 61 +-
BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py | 141 +-
BaseTools/Source/Python/BPDG/BPDG.py | 56 +-
BaseTools/Source/Python/BPDG/GenVpd.py | 132 +-
BaseTools/Source/Python/BPDG/StringTable.py | 10 +-
BaseTools/Source/Python/Common/BuildVersion.py | 6 +-
BaseTools/Source/Python/Common/DataType.py | 26 +-
BaseTools/Source/Python/Common/Database.py | 17 +-
BaseTools/Source/Python/Common/Expression.py | 97 +-
BaseTools/Source/Python/Common/MigrationUtilities.py | 66 +-
BaseTools/Source/Python/Common/Misc.py | 109 +-
BaseTools/Source/Python/Common/MultipleWorkspace.py | 17 +-
BaseTools/Source/Python/Common/RangeExpression.py | 159 +--
BaseTools/Source/Python/Common/String.py | 14 +-
BaseTools/Source/Python/Common/ToolDefClassObject.py | 5 +-
BaseTools/Source/Python/Common/VariableAttributes.py | 12 +-
BaseTools/Source/Python/Common/VpdInfoFile.py | 84 +-
BaseTools/Source/Python/CommonDataClass/FdfClass.py | 28 +-
BaseTools/Source/Python/Ecc/CLexer.py | 8 +-
BaseTools/Source/Python/Ecc/CParser.py | 1468 ++++++++++----------
BaseTools/Source/Python/Ecc/Check.py | 22 +-
BaseTools/Source/Python/Ecc/CodeFragment.py | 3 +-
BaseTools/Source/Python/Ecc/CodeFragmentCollector.py | 124 +-
BaseTools/Source/Python/Ecc/Configuration.py | 10 +-
BaseTools/Source/Python/Ecc/Ecc.py | 26 +-
BaseTools/Source/Python/Ecc/Exception.py | 14 +-
BaseTools/Source/Python/Ecc/FileProfile.py | 5 +-
BaseTools/Source/Python/Ecc/MetaDataParser.py | 46 +-
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py | 4 +-
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py | 100 +-
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py | 88 +-
BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py | 4 +-
BaseTools/Source/Python/Ecc/Xml/__init__.py | 6 +-
BaseTools/Source/Python/Ecc/c.py | 12 +-
BaseTools/Source/Python/Eot/CLexer.py | 8 +-
BaseTools/Source/Python/Eot/CParser.py | 1468 ++++++++++----------
BaseTools/Source/Python/Eot/Eot.py | 21 +-
BaseTools/Source/Python/Eot/EotGlobalData.py | 41 -
BaseTools/Source/Python/Eot/Report.py | 4 +-
BaseTools/Source/Python/GenFds/Capsule.py | 2 +-
BaseTools/Source/Python/GenFds/CapsuleData.py | 18 +-
BaseTools/Source/Python/GenFds/EfiSection.py | 8 +-
BaseTools/Source/Python/GenFds/Fd.py | 2 +-
BaseTools/Source/Python/GenFds/FdfParser.py | 173 ++-
BaseTools/Source/Python/GenFds/Ffs.py | 10 +-
BaseTools/Source/Python/GenFds/FfsFileStatement.py | 4 +-
BaseTools/Source/Python/GenFds/FfsInfStatement.py | 62 +-
BaseTools/Source/Python/GenFds/Fv.py | 70 +-
BaseTools/Source/Python/GenFds/GenFds.py | 32 +-
BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py | 36 +-
BaseTools/Source/Python/GenFds/GuidSection.py | 2 +-
BaseTools/Source/Python/GenFds/OptRomFileStatement.py | 6 +-
BaseTools/Source/Python/GenFds/OptRomInfStatement.py | 21 +-
BaseTools/Source/Python/GenFds/OptionRom.py | 49 +-
BaseTools/Source/Python/GenFds/Region.py | 4 +-
BaseTools/Source/Python/GenFds/Section.py | 2 +-
BaseTools/Source/Python/GenFds/Vtf.py | 18 +-
BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py | 28 +-
BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py | 6 +-
BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py | 34 +-
BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py | 30 +-
BaseTools/Source/Python/Table/Table.py | 20 +-
BaseTools/Source/Python/Table/TableDataModel.py | 14 +-
BaseTools/Source/Python/Table/TableDec.py | 12 +-
BaseTools/Source/Python/Table/TableDsc.py | 12 +-
BaseTools/Source/Python/Table/TableEotReport.py | 6 +-
BaseTools/Source/Python/Table/TableFdf.py | 12 +-
BaseTools/Source/Python/Table/TableFile.py | 12 +-
BaseTools/Source/Python/Table/TableFunction.py | 8 +-
BaseTools/Source/Python/Table/TableIdentifier.py | 4 +-
BaseTools/Source/Python/Table/TableInf.py | 12 +-
BaseTools/Source/Python/Table/TablePcd.py | 4 +-
BaseTools/Source/Python/Table/TableReport.py | 6 +-
BaseTools/Source/Python/TargetTool/TargetTool.py | 24 +-
BaseTools/Source/Python/Trim/Trim.py | 20 +-
BaseTools/Source/Python/Workspace/BuildClassObject.py | 57 +-
BaseTools/Source/Python/Workspace/DscBuildData.py | 41 +-
BaseTools/Source/Python/Workspace/InfBuildData.py | 12 +-
BaseTools/Source/Python/Workspace/MetaDataTable.py | 4 +-
BaseTools/Source/Python/Workspace/MetaFileParser.py | 2 +-
BaseTools/Source/Python/Workspace/MetaFileTable.py | 88 +-
BaseTools/Source/Python/Workspace/WorkspaceDatabase.py | 24 +-
BaseTools/Source/Python/build/BuildReport.py | 36 +-
BaseTools/Source/Python/build/build.py | 19 +-
BaseTools/Source/Python/sitecustomize.py | 2 +-
94 files changed, 3207 insertions(+), 3463 deletions(-)
--
2.16.2.windows.1
^ permalink raw reply [flat|nested] 44+ messages in thread
* [PATCH v1 01/42] BaseTools: FdfParser - update to remove duplicate constant value
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 02/42] BaseTools: AutoGen " Jaben Carsey
` (41 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
The per-type PCD value limits are already shared, so this change both removes
duplication and makes the function work for all numeric PCD types.
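(Not part of the patch: a simplified sketch of the table-driven bound check; the real MAX_VAL_TYPE and TAB_PCD_NUMERIC_TYPES live in Common/DataType.py.)
    MAX_VAL_TYPE = {"UINT8": 0xFF, "UINT16": 0xFFFF, "UINT32": 0xFFFFFFFF, "UINT64": 0xFFFFFFFFFFFFFFFF}

    def check_in_range(value, scope):
        if scope not in MAX_VAL_TYPE:            # verification only applies to numeric types
            return True
        number = int(value, 0)                   # accepts decimal or hex strings
        return 0 <= number <= MAX_VAL_TYPE[scope]

    print(check_in_range("0xFF", "UINT8"))       # True
    print(check_in_range("0x100", "UINT8"))      # False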
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/GenFds/FdfParser.py | 29 ++++++++++----------
1 file changed, 14 insertions(+), 15 deletions(-)
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index 25755a9778f2..80ff3ece43b4 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -1134,21 +1134,20 @@ class FdfParser:
@staticmethod
def __Verify(Name, Value, Scope):
- if Scope in [TAB_UINT64, TAB_UINT8]:
- ValueNumber = 0
- try:
- ValueNumber = int (Value, 0)
- except:
- EdkLogger.error("FdfParser", FORMAT_INVALID, "The value is not valid dec or hex number for %s." % Name)
- if ValueNumber < 0:
- EdkLogger.error("FdfParser", FORMAT_INVALID, "The value can't be set to negative value for %s." % Name)
- if Scope == TAB_UINT64:
- if ValueNumber >= 0x10000000000000000:
- EdkLogger.error("FdfParser", FORMAT_INVALID, "Too large value for %s." % Name)
- if Scope == TAB_UINT8:
- if ValueNumber >= 0x100:
- EdkLogger.error("FdfParser", FORMAT_INVALID, "Too large value for %s." % Name)
- return True
+ # value verification only applies to numeric values.
+ if Scope not in TAB_PCD_NUMERIC_TYPES:
+ return
+
+ ValueNumber = 0
+ try:
+ ValueNumber = int(Value, 0)
+ except:
+ EdkLogger.error("FdfParser", FORMAT_INVALID, "The value is not valid dec or hex number for %s." % Name)
+ if ValueNumber < 0:
+ EdkLogger.error("FdfParser", FORMAT_INVALID, "The value can't be set to negative value for %s." % Name)
+ if ValueNumber > MAX_VAL_TYPE[Scope]:
+ EdkLogger.error("FdfParser", FORMAT_INVALID, "Too large value for %s." % Name)
+ return True
## __UndoToken() method
#
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 02/42] BaseTools: AutoGen - update to remove duplicate constant value
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 01/42] BaseTools: FdfParser - update to remove duplicate constant value Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 03/42] BaseTools: check before accessing members in __eq__ Jaben Carsey
` (40 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
The PCD size by type is already shared; just use it.
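(Not part of the patch: a simplified stand-in for the shared size-by-type lookup used in the diff below.)
    MAX_SIZE_TYPE = {"BOOLEAN": 1, "UINT8": 1, "UINT16": 2, "UINT32": 4, "UINT64": 8}

    def storage_width(pcd_data_type):
        # unknown types fall back to 0, mirroring the error path in the diff
        return MAX_SIZE_TYPE.get(pcd_data_type, 0)

    print(storage_width("UINT32"))    # 4
    print(storage_width("VOID*"))     # 0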
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py | 13 ++++---------
1 file changed, 4 insertions(+), 9 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index df7a9b889aa4..602c90b3fe08 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -250,16 +250,11 @@ class VAR_CHECK_PCD_VALID_OBJ(object):
self.data = set()
self.ValidData = True
self.updateStorageWidth()
+
def updateStorageWidth(self):
- if self.PcdDataType == TAB_UINT8 or self.PcdDataType == "BOOLEAN":
- self.StorageWidth = 1
- elif self.PcdDataType == TAB_UINT16:
- self.StorageWidth = 2
- elif self.PcdDataType == TAB_UINT32:
- self.StorageWidth = 4
- elif self.PcdDataType == TAB_UINT64:
- self.StorageWidth = 8
- else:
+ try:
+ self.StorageWidth = int(MAX_SIZE_TYPE[self.PcdDataType])
+ except:
self.StorageWidth = 0
self.ValidData = False
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 03/42] BaseTools: check before accessing members in __eq__
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 01/42] BaseTools: FdfParser - update to remove duplicate constant value Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 02/42] BaseTools: AutoGen " Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 04/42] BaseTools: this function has no purpose Jaben Carsey
` (39 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Minimize the risk of exceptions by checking the compared object before
accessing its members.
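(Not part of the patch: a minimal sketch of the guard pattern, using a hypothetical class.)
    class Record(object):
        def __init__(self, offset):
            self.VarOffset = offset

        def __eq__(self, other):
            # Evaluates to a false value when other is None instead of
            # raising AttributeError on other.VarOffset.
            return other is not None and self.VarOffset == other.VarOffset

    print(Record(4) == None)          # False, no exception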
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py | 5 +----
BaseTools/Source/Python/build/build.py | 3 ++-
2 files changed, 3 insertions(+), 5 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index 602c90b3fe08..1328dddf1a10 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -259,10 +259,7 @@ class VAR_CHECK_PCD_VALID_OBJ(object):
self.ValidData = False
def __eq__(self, validObj):
- if self.VarOffset == validObj.VarOffset:
- return True
- else:
- return False
+ return validObj and self.VarOffset == validObj.VarOffset
class VAR_CHECK_PCD_VALID_LIST(VAR_CHECK_PCD_VALID_OBJ):
def __init__(self, VarOffset, validlist, PcdDataType):
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 36bb1fecf7e5..1c26e72feb6b 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -377,7 +377,8 @@ class BuildUnit:
# @param Other The other BuildUnit object compared to
#
def __eq__(self, Other):
- return Other is not None and self.BuildObject == Other.BuildObject \
+ return Other and self.BuildObject == Other.BuildObject \
+ and Other.BuildObject \
and self.BuildObject.Arch == Other.BuildObject.Arch
## hash() method
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 04/42] BaseTools: this function has no purpose.
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (2 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 03/42] BaseTools: check before accessing members in __eq__ Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 05/42] BaseTools: AutoGen - refactor assemble_variable Jaben Carsey
` (38 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
It looks like an old proof of concept for the ideas later used to build the
classes in this file.
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py | 17 -----------------
1 file changed, 17 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index 1328dddf1a10..dd78dc520075 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -324,20 +324,3 @@ class VAR_VALID_OBJECT_FACTORY(object):
return VAR_CHECK_PCD_VALID_LIST(VarOffset, PcdClass.validlists, PcdClass.DatumType)
else:
return None
-
-if __name__ == "__main__":
- class TestObj(object):
- def __init__(self, number1):
- self.number_1 = number1
- def __eq__(self, testobj):
- if self.number_1 == testobj.number_1:
- return True
- else:
- return False
- test1 = TestObj(1)
- test2 = TestObj(2)
-
- testarr = [test1, test2]
- print TestObj(2) in testarr
- print TestObj(2) == test2
-
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 05/42] BaseTools: AutoGen - refactor assemble_variable
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (3 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 04/42] BaseTools: this function has no purpose Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 06/42] BaseTools: AutoGen - refactor dictionary access Jaben Carsey
` (37 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Make this function a @staticmethod since the self parameter is not used.
Rename valuelist to valuedict since it is a dictionary.
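(Not part of the patch: a generic sketch of the @staticmethod conversion, with hypothetical names and data.)
    class Packer(object):
        @staticmethod
        def flatten(valuedict):
            # valuedict maps an integer offset to a list of byte strings;
            # no instance state is touched, so the method can be static.
            ordered_offsets = sorted(valuedict)
            return [byte for offset in ordered_offsets for byte in valuedict[offset]]

    # Callers can now use the class name rather than an instance:
    print(Packer.flatten({4: ["0x01"], 0: ["0xAA", "0xBB"]}))   # ['0xAA', '0xBB', '0x01']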
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/GenVar.py | 8 +++++---
1 file changed, 5 insertions(+), 3 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/GenVar.py b/BaseTools/Source/Python/AutoGen/GenVar.py
index 13bcf99b2627..b8f40da9a39c 100644
--- a/BaseTools/Source/Python/AutoGen/GenVar.py
+++ b/BaseTools/Source/Python/AutoGen/GenVar.py
@@ -113,9 +113,10 @@ class VariableMgr(object):
indexedvarinfo[key] = [var_info(n.pcdindex,n.pcdname,n.defaultstoragename,n.skuname,n.var_name, n.var_guid, "0x00",n.var_attribute,newvaluestr , newvaluestr , DataType.TAB_VOID)]
self.VarInfo = [item[0] for item in indexedvarinfo.values()]
- def assemble_variable(self, valuelist):
- ordered_offset = sorted(valuelist.keys())
- ordered_value = [valuelist[k] for k in ordered_offset]
+ @staticmethod
+ def assemble_variable(valuedict):
+ ordered_offset = sorted(valuedict.keys())
+ ordered_value = [valuedict[k] for k in ordered_offset]
var_value = []
num = 0
for offset in ordered_offset:
@@ -126,6 +127,7 @@ class VariableMgr(object):
var_value += ordered_value[num]
num +=1
return var_value
+
def process_variable_data(self):
var_data = collections.defaultdict(collections.OrderedDict)
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 06/42] BaseTools: AutoGen - refactor dictionary access
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (4 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 05/42] BaseTools: AutoGen - refactor assemble_variable Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 07/42] BaseTools: AutoGen - GenVar refactor static methods Jaben Carsey
` (36 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Don't use dict.get() inside loops over the dictionary's own contents; every
key is guaranteed to be present, so plain indexing is enough (sketched below).
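(Not part of the patch: a sketch with hypothetical data of why .get() is unnecessary here.)
    var_data = {("DEFAULT", "STANDARD"): 1, ("SKU1", "STANDARD"): 2}
    for key in var_data:
        value = var_data[key]         # the key comes from the dict itself, so it is always present
        # var_data.get(key) would only hide a real bug by returning None on a missing key
        print(value)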
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/GenVar.py | 8 ++++----
1 file changed, 4 insertions(+), 4 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/GenVar.py b/BaseTools/Source/Python/AutoGen/GenVar.py
index b8f40da9a39c..b9517d84c690 100644
--- a/BaseTools/Source/Python/AutoGen/GenVar.py
+++ b/BaseTools/Source/Python/AutoGen/GenVar.py
@@ -144,7 +144,7 @@ class VariableMgr(object):
default_data_buffer = ""
others_data_buffer = ""
tail = None
- default_sku_default = indexedvarinfo.get(index).get((DataType.TAB_DEFAULT,DataType.TAB_DEFAULT_STORES_DEFAULT))
+ default_sku_default = indexedvarinfo[index].get((DataType.TAB_DEFAULT,DataType.TAB_DEFAULT_STORES_DEFAULT))
if default_sku_default.data_type not in DataType.TAB_PCD_NUMERIC_TYPES:
var_max_len = max([len(var_item.default_value.split(",")) for var_item in sku_var_info.values()])
@@ -159,11 +159,11 @@ class VariableMgr(object):
var_data[(DataType.TAB_DEFAULT,DataType.TAB_DEFAULT_STORES_DEFAULT)][index] = (default_data_buffer,sku_var_info[(DataType.TAB_DEFAULT,DataType.TAB_DEFAULT_STORES_DEFAULT)])
- for (skuid,defaultstoragename) in indexedvarinfo.get(index):
+ for (skuid,defaultstoragename) in indexedvarinfo[index]:
tail = None
if (skuid,defaultstoragename) == (DataType.TAB_DEFAULT,DataType.TAB_DEFAULT_STORES_DEFAULT):
continue
- other_sku_other = indexedvarinfo.get(index).get((skuid,defaultstoragename))
+ other_sku_other = indexedvarinfo[index][(skuid,defaultstoragename)]
if default_sku_default.data_type not in DataType.TAB_PCD_NUMERIC_TYPES:
if len(other_sku_other.default_value.split(",")) < var_max_len:
@@ -220,7 +220,7 @@ class VariableMgr(object):
for skuname,defaultstore in var_data:
if (skuname,defaultstore) == (DataType.TAB_DEFAULT,DataType.TAB_DEFAULT_STORES_DEFAULT):
continue
- pcds_sku_data = var_data.get((skuname,defaultstore))
+ pcds_sku_data = var_data[(skuname,defaultstore)]
delta_data_set = []
for pcdindex in pcds_sku_data:
offset = var_data_offset[pcdindex]
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 07/42] BaseTools: AutoGen - GenVar refactor static methods
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (5 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 06/42] BaseTools: AutoGen - refactor dictionary access Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 08/42] BaseTools: AutoGen - share StripComments API Jaben Carsey
` (35 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Change methods that do not use self to @staticmethod and update their call
sites to use the class name instead of an instance.
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/GenVar.py | 61 ++++++++++++--------
1 file changed, 36 insertions(+), 25 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/GenVar.py b/BaseTools/Source/Python/AutoGen/GenVar.py
index b9517d84c690..9d226d0f4567 100644
--- a/BaseTools/Source/Python/AutoGen/GenVar.py
+++ b/BaseTools/Source/Python/AutoGen/GenVar.py
@@ -71,7 +71,7 @@ class VariableMgr(object):
if not self.NVHeaderBuff:
return ""
self.NVHeaderBuff = self.NVHeaderBuff[:8] + pack("=Q",maxsize)
- default_var_bin = self.format_data(self.NVHeaderBuff + self.VarDefaultBuff + self.VarDeltaBuff)
+ default_var_bin = VariableMgr.format_data(self.NVHeaderBuff + self.VarDefaultBuff + self.VarDeltaBuff)
value_str = "{"
default_var_bin_strip = [ data.strip("""'""") for data in default_var_bin]
value_str += ",".join(default_var_bin_strip)
@@ -106,7 +106,7 @@ class VariableMgr(object):
value_list += [hex(unpack("B",data_byte)[0])]
newvalue[int(item.var_offset,16) if item.var_offset.upper().startswith("0X") else int(item.var_offset)] = value_list
try:
- newvaluestr = "{" + ",".join(self.assemble_variable(newvalue)) +"}"
+ newvaluestr = "{" + ",".join(VariableMgr.assemble_variable(newvalue)) +"}"
except:
EdkLogger.error("build", AUTOGEN_ERROR, "Variable offset conflict in PCDs: %s \n" % (" and ".join([item.pcdname for item in sku_var_info_offset_list])))
n = sku_var_info_offset_list[0]
@@ -151,7 +151,7 @@ class VariableMgr(object):
if len(default_sku_default.default_value.split(",")) < var_max_len:
tail = ",".join([ "0x00" for i in range(var_max_len-len(default_sku_default.default_value.split(",")))])
- default_data_buffer = self.PACK_VARIABLES_DATA(default_sku_default.default_value,default_sku_default.data_type,tail)
+ default_data_buffer = VariableMgr.PACK_VARIABLES_DATA(default_sku_default.default_value,default_sku_default.data_type,tail)
default_data_array = ()
for item in default_data_buffer:
@@ -169,13 +169,13 @@ class VariableMgr(object):
if len(other_sku_other.default_value.split(",")) < var_max_len:
tail = ",".join([ "0x00" for i in range(var_max_len-len(other_sku_other.default_value.split(",")))])
- others_data_buffer = self.PACK_VARIABLES_DATA(other_sku_other.default_value,other_sku_other.data_type,tail)
+ others_data_buffer = VariableMgr.PACK_VARIABLES_DATA(other_sku_other.default_value,other_sku_other.data_type,tail)
others_data_array = ()
for item in others_data_buffer:
others_data_array += unpack("B",item)
- data_delta = self.calculate_delta(default_data_array, others_data_array)
+ data_delta = VariableMgr.calculate_delta(default_data_array, others_data_array)
var_data[(skuid,defaultstoragename)][index] = (data_delta,sku_var_info[(skuid,defaultstoragename)])
return var_data
@@ -193,7 +193,7 @@ class VariableMgr(object):
var_data_offset = collections.OrderedDict()
offset = NvStorageHeaderSize
for default_data,default_info in pcds_default_data.values():
- var_name_buffer = self.PACK_VARIABLE_NAME(default_info.var_name)
+ var_name_buffer = VariableMgr.PACK_VARIABLE_NAME(default_info.var_name)
vendorguid = default_info.var_guid.split('-')
@@ -202,19 +202,19 @@ class VariableMgr(object):
else:
var_attr_value = 0x07
- DataBuffer = self.AlignData(var_name_buffer + default_data)
+ DataBuffer = VariableMgr.AlignData(var_name_buffer + default_data)
data_size = len(DataBuffer)
offset += VariableHeaderSize + len(default_info.var_name.split(","))
var_data_offset[default_info.pcdindex] = offset
offset += data_size - len(default_info.var_name.split(","))
- var_header_buffer = self.PACK_VARIABLE_HEADER(var_attr_value, len(default_info.var_name.split(",")), len (default_data), vendorguid)
+ var_header_buffer = VariableMgr.PACK_VARIABLE_HEADER(var_attr_value, len(default_info.var_name.split(",")), len (default_data), vendorguid)
NvStoreDataBuffer += (var_header_buffer + DataBuffer)
- variable_storage_header_buffer = self.PACK_VARIABLE_STORE_HEADER(len(NvStoreDataBuffer) + 28)
+ variable_storage_header_buffer = VariableMgr.PACK_VARIABLE_STORE_HEADER(len(NvStoreDataBuffer) + 28)
- nv_default_part = self.AlignData(self.PACK_DEFAULT_DATA(0, 0, self.unpack_data(variable_storage_header_buffer+NvStoreDataBuffer)), 8)
+ nv_default_part = VariableMgr.AlignData(VariableMgr.PACK_DEFAULT_DATA(0, 0, VariableMgr.unpack_data(variable_storage_header_buffer+NvStoreDataBuffer)), 8)
data_delta_structure_buffer = ""
for skuname,defaultstore in var_data:
@@ -228,29 +228,31 @@ class VariableMgr(object):
delta_data = [(item[0] + offset, item[1]) for item in delta_data]
delta_data_set.extend(delta_data)
- data_delta_structure_buffer += self.AlignData(self.PACK_DELTA_DATA(skuname,defaultstore,delta_data_set), 8)
+ data_delta_structure_buffer += VariableMgr.AlignData(self.PACK_DELTA_DATA(skuname,defaultstore,delta_data_set), 8)
size = len(nv_default_part + data_delta_structure_buffer) + 16
maxsize = self.VpdRegionSize if self.VpdRegionSize else size
- NV_Store_Default_Header = self.PACK_NV_STORE_DEFAULT_HEADER(size,maxsize)
+ NV_Store_Default_Header = VariableMgr.PACK_NV_STORE_DEFAULT_HEADER(size,maxsize)
self.NVHeaderBuff = NV_Store_Default_Header
self.VarDefaultBuff =nv_default_part
self.VarDeltaBuff = data_delta_structure_buffer
- return self.format_data(NV_Store_Default_Header + nv_default_part + data_delta_structure_buffer)
+ return VariableMgr.format_data(NV_Store_Default_Header + nv_default_part + data_delta_structure_buffer)
- def format_data(self,data):
+ @staticmethod
+ def format_data(data):
+ return [hex(item) for item in VariableMgr.unpack_data(data)]
- return [hex(item) for item in self.unpack_data(data)]
-
- def unpack_data(self,data):
+ @staticmethod
+ def unpack_data(data):
final_data = ()
for item in data:
final_data += unpack("B",item)
return final_data
- def calculate_delta(self, default, theother):
+ @staticmethod
+ def calculate_delta(default, theother):
if len(default) - len(theother) != 0:
EdkLogger.error("build", FORMAT_INVALID, 'The variable data length is not the same for the same PCD.')
data_delta = []
@@ -270,7 +272,8 @@ class VariableMgr(object):
return value_str
return ""
- def PACK_VARIABLE_STORE_HEADER(self,size):
+ @staticmethod
+ def PACK_VARIABLE_STORE_HEADER(size):
#Signature: gEfiVariableGuid
Guid = "{ 0xddcf3616, 0x3275, 0x4164, { 0x98, 0xb6, 0xfe, 0x85, 0x70, 0x7f, 0xfe, 0x7d }}"
Guid = GuidStructureStringToGuidString(Guid)
@@ -284,7 +287,8 @@ class VariableMgr(object):
return GuidBuffer + SizeBuffer + FormatBuffer + StateBuffer + reservedBuffer
- def PACK_NV_STORE_DEFAULT_HEADER(self,size,maxsize):
+ @staticmethod
+ def PACK_NV_STORE_DEFAULT_HEADER(size,maxsize):
Signature = pack('=B',ord('N'))
Signature += pack("=B",ord('S'))
Signature += pack("=B",ord('D'))
@@ -295,7 +299,8 @@ class VariableMgr(object):
return Signature + SizeBuffer + MaxSizeBuffer
- def PACK_VARIABLE_HEADER(self,attribute,namesize,datasize,vendorguid):
+ @staticmethod
+ def PACK_VARIABLE_HEADER(attribute,namesize,datasize,vendorguid):
Buffer = pack('=H',0x55AA) # pack StartID
Buffer += pack('=B',0x3F) # pack State
@@ -309,7 +314,8 @@ class VariableMgr(object):
return Buffer
- def PACK_VARIABLES_DATA(self, var_value,data_type, tail = None):
+ @staticmethod
+ def PACK_VARIABLES_DATA(var_value,data_type, tail = None):
Buffer = ""
data_len = 0
if data_type == DataType.TAB_VOID:
@@ -338,7 +344,8 @@ class VariableMgr(object):
return Buffer
- def PACK_DEFAULT_DATA(self, defaultstoragename,skuid,var_value):
+ @staticmethod
+ def PACK_DEFAULT_DATA(defaultstoragename,skuid,var_value):
Buffer = ""
Buffer += pack("=L",4+8+8)
Buffer += pack("=Q",int(skuid))
@@ -355,10 +362,12 @@ class VariableMgr(object):
if skuname not in self.SkuIdMap:
return None
return self.SkuIdMap.get(skuname)[0]
+
def GetDefaultStoreId(self,dname):
if dname not in self.DefaultStoreMap:
return None
return self.DefaultStoreMap.get(dname)[0]
+
def PACK_DELTA_DATA(self,skuname,defaultstoragename,delta_list):
skuid = self.GetSkuId(skuname)
defaultstorageid = self.GetDefaultStoreId(defaultstoragename)
@@ -374,7 +383,8 @@ class VariableMgr(object):
return Buffer
- def AlignData(self,data, align = 4):
+ @staticmethod
+ def AlignData(data, align = 4):
mybuffer = data
if (len(data) % align) > 0:
for i in range(align - (len(data) % align)):
@@ -382,7 +392,8 @@ class VariableMgr(object):
return mybuffer
- def PACK_VARIABLE_NAME(self, var_name):
+ @staticmethod
+ def PACK_VARIABLE_NAME(var_name):
Buffer = ""
for name_char in var_name.strip("{").strip("}").split(","):
Buffer += pack("=B",int(name_char,16))
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 08/42] BaseTools: AutoGen - share StripComments API
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (6 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 07/42] BaseTools: AutoGen - GenVar refactor static methods Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 09/42] BaseTools: AutoGen - refactor class factory Jaben Carsey
` (34 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Add the shared API at module level in UniClassObject.py, delete the duplicate
method from both classes, and import it from the single location (simplified
sketch below).
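(Not part of the patch: a simplified sketch of the hoist; this version omits the quote/escape handling of the real StripComments.)
    # shared helper at module level, importable by both classes
    def strip_comments(line, marker="//"):
        pos = line.find(marker)
        return (line[:pos] if pos >= 0 else line).strip()

    # each former copy becomes:  from <shared module> import strip_comments
    print(strip_comments("#string STR_EXAMPLE  // trailing comment"))   # '#string STR_EXAMPLE'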
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/IdfClassObject.py | 19 ++-----------
BaseTools/Source/Python/AutoGen/UniClassObject.py | 29 ++++++++++----------
2 files changed, 16 insertions(+), 32 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/IdfClassObject.py b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
index 6953854a5247..82396d3744d5 100644
--- a/BaseTools/Source/Python/AutoGen/IdfClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
@@ -22,6 +22,7 @@ from Common.LongFilePathSupport import LongFilePath
import re
import os
from Common.GlobalData import gIdentifierPattern
+from UniClassObject import StripComments
IMAGE_TOKEN = re.compile('IMAGE_TOKEN *\(([A-Z0-9_]+) *\)', re.MULTILINE | re.UNICODE)
@@ -91,7 +92,7 @@ class IdfFileClassObject(object):
ImageFileList = []
for Line in FileIn.splitlines():
Line = Line.strip()
- Line = self.StripComments(Line)
+ Line = StripComments(Line)
if len(Line) == 0:
continue
@@ -121,22 +122,6 @@ class IdfFileClassObject(object):
if ImageFileList:
self.ImageFilesDict[File] = ImageFileList
- def StripComments(self, Line):
- Comment = '//'
- CommentPos = Line.find(Comment)
- while CommentPos >= 0:
- # if there are non matched quotes before the comment header
- # then we are in the middle of a string
- # but we need to ignore the escaped quotes and backslashes.
- if ((Line.count('"', 0, CommentPos) - Line.count('\\"', 0, CommentPos)) & 1) == 1:
- CommentPos = Line.find (Comment, CommentPos + 1)
- else:
- return Line[:CommentPos].strip()
- return Line.strip()
-
- def ImageDecoder(self, File):
- pass
-
def SearchImageID(ImageFileObject, FileList):
if FileList == []:
return ImageFileObject
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index 5b879d784d9c..4e16afec5c28 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -218,6 +218,19 @@ class StringDefClassObject(object):
self.StringValueByteList = UniToHexList(self.StringValue)
self.Length = len(self.StringValueByteList)
+def StripComments(Line):
+ Comment = u'//'
+ CommentPos = Line.find(Comment)
+ while CommentPos >= 0:
+ # if there are non matched quotes before the comment header
+ # then we are in the middle of a string
+ # but we need to ignore the escaped quotes and backslashes.
+ if ((Line.count(u'"', 0, CommentPos) - Line.count(u'\\"', 0, CommentPos)) & 1) == 1:
+ CommentPos = Line.find (Comment, CommentPos + 1)
+ else:
+ return Line[:CommentPos].strip()
+ return Line.strip()
+
## UniFileClassObject
#
# A structure for .uni file definition
@@ -371,20 +384,6 @@ class UniFileClassObject(object):
FileName = Item[Item.find(u'#include ') + len(u'#include ') :Item.find(u' ', len(u'#include '))][1:-1]
self.LoadUniFile(FileName)
- def StripComments(self, Line):
- Comment = u'//'
- CommentPos = Line.find(Comment)
- while CommentPos >= 0:
- # if there are non matched quotes before the comment header
- # then we are in the middle of a string
- # but we need to ignore the escaped quotes and backslashes.
- if ((Line.count(u'"', 0, CommentPos) - Line.count(u'\\"', 0, CommentPos)) & 1) == 1:
- CommentPos = Line.find (Comment, CommentPos + 1)
- else:
- return Line[:CommentPos].strip()
- return Line.strip()
-
-
#
# Pre-process before parse .uni file
#
@@ -406,7 +405,7 @@ class UniFileClassObject(object):
for Line in FileIn:
Line = Line.strip()
Line = Line.replace(u'\\\\', BACK_SLASH_PLACEHOLDER)
- Line = self.StripComments(Line)
+ Line = StripComments(Line)
#
# Ignore empty line
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 09/42] BaseTools: AutoGen - refactor class factory
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (7 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 08/42] BaseTools: AutoGen - share StripComments API Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 10/42] BaseTools: Eot - remove unused lists Jaben Carsey
` (33 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Since instances are never added to the cache, the __new__ factory does nothing
useful; replace it with a plain constructor.
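(Not part of the patch: a hypothetical sketch of the pattern being removed.)
    class CachedThing(object):
        _cache = {}

        def __new__(cls, key):
            if key in cls._cache:                # never true: the cache is never written
                return cls._cache[key]
            obj = super(CachedThing, cls).__new__(cls)
            obj.key = key
            # cls._cache[key] = obj              # the write is commented out, as in the removed code
            return obj

    # With the cache write disabled, the factory is equivalent to a plain constructor:
    class PlainThing(object):
        def __init__(self, key):
            self.key = key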
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/BuildEngine.py | 23 ++++----------------
1 file changed, 4 insertions(+), 19 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/BuildEngine.py b/BaseTools/Source/Python/AutoGen/BuildEngine.py
index bbd1a4d5b257..2c823797d7c5 100644
--- a/BaseTools/Source/Python/AutoGen/BuildEngine.py
+++ b/BaseTools/Source/Python/AutoGen/BuildEngine.py
@@ -47,21 +47,10 @@ def ListFileMacro(FileType):
return "%s_LIST" % FileListMacro(FileType)
class TargetDescBlock(object):
- _Cache_ = {} # {TargetFile : TargetDescBlock object}
+ def __init__(self, Inputs, Outputs, Commands, Dependencies):
+ self.InitWorker(Inputs, Outputs, Commands, Dependencies)
- # Factory method
- def __new__(Class, Inputs, Outputs, Commands, Dependencies):
- if Outputs[0] in Class._Cache_:
- Tdb = Class._Cache_[Outputs[0]]
- for File in Inputs:
- Tdb.AddInput(File)
- else:
- Tdb = super(TargetDescBlock, Class).__new__(Class)
- Tdb._Init(Inputs, Outputs, Commands, Dependencies)
- #Class._Cache_[Outputs[0]] = Tdb
- return Tdb
-
- def _Init(self, Inputs, Outputs, Commands, Dependencies):
+ def InitWorker(self, Inputs, Outputs, Commands, Dependencies):
self.Inputs = Inputs
self.Outputs = Outputs
self.Commands = Commands
@@ -90,10 +79,6 @@ class TargetDescBlock(object):
def IsMultipleInput(self):
return len(self.Inputs) > 1
- @staticmethod
- def Renew():
- TargetDescBlock._Cache_ = {}
-
## Class for one build rule
#
# This represents a build rule which can give out corresponding command list for
@@ -278,7 +263,7 @@ class FileBuildRule:
# Command line should be regenerated since some macros are different
#
CommandList = self._BuildCommand(BuildRulePlaceholderDict)
- TargetDesc._Init([SourceFile], DstFile, CommandList, self.ExtraSourceFileList)
+ TargetDesc.InitWorker([SourceFile], DstFile, CommandList, self.ExtraSourceFileList)
break
else:
TargetDesc.AddInput(SourceFile)
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 10/42] BaseTools: Eot - remove unused lists
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (8 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 09/42] BaseTools: AutoGen - refactor class factory Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 11/42] BaseTools: Eot - refactor global data Jaben Carsey
` (32 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/Eot/EotGlobalData.py | 29 --------------------
1 file changed, 29 deletions(-)
diff --git a/BaseTools/Source/Python/Eot/EotGlobalData.py b/BaseTools/Source/Python/Eot/EotGlobalData.py
index 7689b76da9d6..da224a7ee3e3 100644
--- a/BaseTools/Source/Python/Eot/EotGlobalData.py
+++ b/BaseTools/Source/Python/Eot/EotGlobalData.py
@@ -104,32 +104,3 @@ gConsumedProtocolLibrary['EfiHandleProtocol'] = 1
# Dict for callback PROTOCOL function callling
gCallbackProtocolLibrary = OrderedDict()
gCallbackProtocolLibrary['EfiRegisterProtocolCallback'] = 2
-
-# Dict for ARCH PROTOCOL
-gArchProtocols = ['gEfiBdsArchProtocolGuid',
- 'gEfiCapsuleArchProtocolGuid',
- 'gEfiCpuArchProtocolGuid', #5053697e-2cbc-4819-90d9-0580deee5754
- 'gEfiMetronomeArchProtocolGuid',
- 'gEfiMonotonicCounterArchProtocolGuid',
- 'gEfiRealTimeClockArchProtocolGuid',
- 'gEfiResetArchProtocolGuid',
- 'gEfiRuntimeArchProtocolGuid',
- 'gEfiSecurityArchProtocolGuid',
- 'gEfiStatusCodeRuntimeProtocolGuid',
- 'gEfiTimerArchProtocolGuid',
- 'gEfiVariableArchProtocolGuid',
- 'gEfiVariableWriteArchProtocolGuid',
- 'gEfiWatchdogTimerArchProtocolGuid']
-gArchProtocolGuids = ['665e3ff6-46cc-11d4-9a38-0090273fc14d',
- '26baccb1-6f42-11d4-bce7-0080c73c8881',
- '26baccb2-6f42-11d4-bce7-0080c73c8881',
- '1da97072-bddc-4b30-99f1-72a0b56fff2a',
- '27cfac87-46cc-11d4-9a38-0090273fc14d',
- '27cfac88-46cc-11d4-9a38-0090273fc14d',
- 'b7dfb4e1-052f-449f-87be-9818fc91b733',
- 'a46423e3-4617-49f1-b9ff-d1bfa9115839',
- 'd2b2b828-0826-48a7-b3df-983c006024f0',
- '26baccb3-6f42-11d4-bce7-0080c73c8881',
- '1e5668e2-8481-11d4-bcf1-0080c73c8881',
- '6441f818-6362-4e44-b570-7dba31dd2453',
- '665e3ff5-46cc-11d4-9a38-0090273fc14d']
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 11/42] BaseTools: Eot - refactor global data
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (9 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 10/42] BaseTools: Eot - remove unused lists Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 12/42] BaseTools: AutoGen - remove global line Jaben Carsey
` (31 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Remove unused lists, dicts, and duplicate variables.
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/Eot/Eot.py | 5 -----
BaseTools/Source/Python/Eot/EotGlobalData.py | 12 ------------
2 files changed, 17 deletions(-)
diff --git a/BaseTools/Source/Python/Eot/Eot.py b/BaseTools/Source/Python/Eot/Eot.py
index 15de822d69c2..fcde8fd3e22f 100644
--- a/BaseTools/Source/Python/Eot/Eot.py
+++ b/BaseTools/Source/Python/Eot/Eot.py
@@ -302,9 +302,6 @@ class Eot(object):
EotGlobalData.gINF_FILES = mFileList
EotGlobalData.gOP_INF.close()
- EotGlobalData.gDEC_FILES = mDecFileList
-
-
## GenerateReport() method
#
# Generate final HTML report
@@ -393,8 +390,6 @@ class Eot(object):
SqlCommand = """select DISTINCT GuidValue, ItemType from Report where ModuleID = -2 and ItemMode = 'Produced'"""
RecordSet = EotGlobalData.gDb.TblReport.Exec(SqlCommand)
for Record in RecordSet:
- if Record[1] == 'Ppi':
- EotGlobalData.gPpiList[Record[0].lower()] = -2
if Record[1] == 'Protocol':
EotGlobalData.gProtocolList[Record[0].lower()] = -2
diff --git a/BaseTools/Source/Python/Eot/EotGlobalData.py b/BaseTools/Source/Python/Eot/EotGlobalData.py
index da224a7ee3e3..a9f51189c1eb 100644
--- a/BaseTools/Source/Python/Eot/EotGlobalData.py
+++ b/BaseTools/Source/Python/Eot/EotGlobalData.py
@@ -36,11 +36,6 @@ gMACRO['EDK_SOURCE'] = gEDK_SOURCE
gMACRO['SHELL_INF'] = gSHELL_INF
gMACRO['CAPSULE_INF'] = ''
-gNOT_FOUND_FILES = []
-gSOURCE_FILES = []
-gINF_FILES = {}
-gDEC_FILES = []
-
# Log file for unmatched variables
gUN_MATCHED_LOG = 'Log_UnMatched.log'
gOP_UN_MATCHED = open(gUN_MATCHED_LOG, 'w+')
@@ -61,10 +56,6 @@ gOP_UN_MATCHED_IN_LIBRARY_CALLING = open(gUN_MATCHED_IN_LIBRARY_CALLING_LOG, 'w+
gDISPATCH_ORDER_LOG = 'Log_DispatchOrder.log'
gOP_DISPATCH_ORDER = open(gDISPATCH_ORDER_LOG, 'w+')
-# Log file for source files not found
-gUN_FOUND_FILES = 'Log_UnFoundSourceFiles.log'
-gOP_UN_FOUND_FILES = open(gUN_FOUND_FILES, 'w+')
-
# Log file for found source files
gSOURCE_FILES = 'Log_SourceFiles.log'
gOP_SOURCE_FILES = open(gSOURCE_FILES, 'w+')
@@ -72,9 +63,6 @@ gOP_SOURCE_FILES = open(gSOURCE_FILES, 'w+')
# Dict for GUID found in DEC files
gGuidDict = dict()
-# Dict for PPI
-gPpiList = {}
-
# Dict for PROTOCOL
gProtocolList = {}
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 12/42] BaseTools: AutoGen - remove global line
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (10 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 11/42] BaseTools: Eot - refactor global data Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 13/42] BaseTools: AutoGen - UniClassObject refactor static methods Jaben Carsey
` (30 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel
This serves no purpose since we never rebind or assign to the global name; a
'global' statement is only needed for assignment.
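(Not part of the patch: a minimal sketch, with hypothetical names, of when 'global' is actually required.)
    LANG_TABLE = {"eng": "en", "fra": "fr"}

    def lookup(code):
        return LANG_TABLE.get(code)              # read-only access needs no 'global'

    def reset_table():
        global LANG_TABLE                        # rebinding the module-level name does
        LANG_TABLE = {}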
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/UniClassObject.py | 2 --
1 file changed, 2 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index 4e16afec5c28..aa97f19e55b4 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -118,8 +118,6 @@ LangConvTable = {'eng':'en', 'fra':'fr', \
# @retval LangName: Valid lanugage code in RFC 4646 format or None
#
def GetLanguageCode(LangName, IsCompatibleMode, File):
- global LangConvTable
-
length = len(LangName)
if IsCompatibleMode:
if length == 3 and LangName.isalpha():
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 13/42] BaseTools: AutoGen - UniClassObject refactor static methods
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (11 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 12/42] BaseTools: AutoGen - remove global line Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 14/42] BaseTools: refactor to use list not dict Jaben Carsey
` (29 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Change methods that do not use self to @staticmethod and update their call
sites to use the class name instead of an instance.
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/UniClassObject.py | 12 +++++++-----
1 file changed, 7 insertions(+), 5 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index aa97f19e55b4..54b6fb22a08a 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -253,7 +253,7 @@ class UniFileClassObject(object):
Lang = distutils.util.split_quoted((Line.split(u"//")[0]))
if len(Lang) != 3:
try:
- FileIn = self.OpenUniFile(LongFilePath(File.Path))
+ FileIn = UniFileClassObject.OpenUniFile(LongFilePath(File.Path))
except UnicodeError, X:
EdkLogger.error("build", FILE_READ_FAILURE, "File read failure: %s" % str(X), ExtraData=File);
except:
@@ -297,7 +297,8 @@ class UniFileClassObject(object):
self.OrderedStringDict[LangName][Item.StringName] = len(self.OrderedStringList[LangName]) - 1
return True
- def OpenUniFile(self, FileName):
+ @staticmethod
+ def OpenUniFile(FileName):
#
# Read file
#
@@ -316,14 +317,15 @@ class UniFileClassObject(object):
FileIn.startswith(codecs.BOM_UTF16_LE)):
Encoding = 'utf-16'
- self.VerifyUcs2Data(FileIn, FileName, Encoding)
+ UniFileClassObject.VerifyUcs2Data(FileIn, FileName, Encoding)
UniFile = StringIO.StringIO(FileIn)
Info = codecs.lookup(Encoding)
(Reader, Writer) = (Info.streamreader, Info.streamwriter)
return codecs.StreamReaderWriter(UniFile, Reader, Writer)
- def VerifyUcs2Data(self, FileIn, FileName, Encoding):
+ @staticmethod
+ def VerifyUcs2Data(FileIn, FileName, Encoding):
Ucs2Info = codecs.lookup('ucs-2')
#
# Convert to unicode
@@ -390,7 +392,7 @@ class UniFileClassObject(object):
EdkLogger.error("Unicode File Parser", FILE_NOT_FOUND, ExtraData=File.Path)
try:
- FileIn = self.OpenUniFile(LongFilePath(File.Path))
+ FileIn = UniFileClassObject.OpenUniFile(LongFilePath(File.Path))
except UnicodeError, X:
EdkLogger.error("build", FILE_READ_FAILURE, "File read failure: %s" % str(X), ExtraData=File.Path);
except:
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 14/42] BaseTools: refactor to use list not dict
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (12 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 13/42] BaseTools: AutoGen - UniClassObject refactor static methods Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 15/42] BaseTools: eliminate {} from dictionary constructor call Jaben Carsey
` (28 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Since we never access the values in the copied dict, snapshot just the keys
into a list instead (sketched below).
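(Not part of the patch: a sketch with hypothetical data of why a key snapshot is enough.)
    options = {"A_B": 1, "A_*": 2, "C_D": 3}
    for key in list(options.keys()):             # snapshot of the keys, not a second dict
        if "*" in key:
            del options[key]                     # safe: we are not iterating 'options' itself
    print(sorted(options))                       # ['A_B', 'C_D']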
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/Common/ToolDefClassObject.py | 5 +++--
BaseTools/Source/Python/GenFds/GenFds.py | 2 +-
2 files changed, 4 insertions(+), 3 deletions(-)
diff --git a/BaseTools/Source/Python/Common/ToolDefClassObject.py b/BaseTools/Source/Python/Common/ToolDefClassObject.py
index 73ebdaf6b179..49b24ef780c7 100644
--- a/BaseTools/Source/Python/Common/ToolDefClassObject.py
+++ b/BaseTools/Source/Python/Common/ToolDefClassObject.py
@@ -92,7 +92,9 @@ class ToolDefClassObject(object):
KeyList = [TAB_TOD_DEFINES_TARGET, TAB_TOD_DEFINES_TOOL_CHAIN_TAG, TAB_TOD_DEFINES_TARGET_ARCH, TAB_TOD_DEFINES_COMMAND_TYPE]
for Index in range(3, -1, -1):
- for Key in dict(self.ToolsDefTxtDictionary):
+ # make a copy of the keys to enumerate over to prevent issues when
+ # adding/removing items from the original dict.
+ for Key in list(self.ToolsDefTxtDictionary.keys()):
List = Key.split('_')
if List[Index] == '*':
for String in self.ToolsDefTxtDatabase[KeyList[Index]]:
@@ -100,7 +102,6 @@ class ToolDefClassObject(object):
NewKey = '%s_%s_%s_%s_%s' % tuple(List)
if NewKey not in self.ToolsDefTxtDictionary:
self.ToolsDefTxtDictionary[NewKey] = self.ToolsDefTxtDictionary[Key]
- continue
del self.ToolsDefTxtDictionary[Key]
elif List[Index] not in self.ToolsDefTxtDatabase[KeyList[Index]]:
del self.ToolsDefTxtDictionary[Key]
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index 54c7d828305f..74017e72629b 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -428,7 +428,7 @@ def FindExtendTool(KeyStringList, CurrentArchList, NameGuid):
if BuildOption:
ToolList = [TAB_TOD_DEFINES_TARGET, TAB_TOD_DEFINES_TOOL_CHAIN_TAG, TAB_TOD_DEFINES_TARGET_ARCH]
for Index in range(2, -1, -1):
- for Key in dict(BuildOption):
+ for Key in list(BuildOption.keys()):
List = Key.split('_')
if List[Index] == '*':
for String in ToolDb[ToolList[Index]]:
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 15/42] BaseTools: eliminate {} from dictionary constructor call
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (13 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 14/42] BaseTools: refactor to use list not dict Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 16/42] BaseTools: remove Compound statements Jaben Carsey
` (27 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
There is no need to construct two dictionaries; OrderedDict() alone is enough.
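(Not part of the patch: a two-line illustration of the extra allocation.)
    import collections
    d1 = collections.OrderedDict({})    # builds a throwaway {} first, then copies it
    d2 = collections.OrderedDict()      # same result, one allocation
    assert d1 == d2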
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/Workspace/BuildClassObject.py | 10 +++++-----
1 file changed, 5 insertions(+), 5 deletions(-)
diff --git a/BaseTools/Source/Python/Workspace/BuildClassObject.py b/BaseTools/Source/Python/Workspace/BuildClassObject.py
index 226277a45130..8b3e2ec29973 100644
--- a/BaseTools/Source/Python/Workspace/BuildClassObject.py
+++ b/BaseTools/Source/Python/Workspace/BuildClassObject.py
@@ -123,16 +123,16 @@ class StructurePcd(PcdClassObject):
self.StructuredPcdIncludeFile = [] if StructuredPcdIncludeFile is None else StructuredPcdIncludeFile
self.PackageDecs = Packages
self.DefaultStoreName = [default_store]
- self.DefaultValues = collections.OrderedDict({})
+ self.DefaultValues = collections.OrderedDict()
self.PcdMode = None
- self.SkuOverrideValues = collections.OrderedDict({})
+ self.SkuOverrideValues = collections.OrderedDict()
self.FlexibleFieldName = None
self.StructName = None
self.PcdDefineLineNo = 0
self.PkgPath = ""
self.DefaultValueFromDec = ""
self.ValueChain = dict()
- self.PcdFieldValueFromComm = collections.OrderedDict({})
+ self.PcdFieldValueFromComm = collections.OrderedDict()
def __repr__(self):
return self.TypeName
@@ -146,9 +146,9 @@ class StructurePcd(PcdClassObject):
self.DefaultValueFromDec = DefaultValue
def AddOverrideValue (self, FieldName, Value, SkuName, DefaultStoreName, FileName="", LineNo=0):
if SkuName not in self.SkuOverrideValues:
- self.SkuOverrideValues[SkuName] = collections.OrderedDict({})
+ self.SkuOverrideValues[SkuName] = collections.OrderedDict()
if DefaultStoreName not in self.SkuOverrideValues[SkuName]:
- self.SkuOverrideValues[SkuName][DefaultStoreName] = collections.OrderedDict({})
+ self.SkuOverrideValues[SkuName][DefaultStoreName] = collections.OrderedDict()
if FieldName in self.SkuOverrideValues[SkuName][DefaultStoreName]:
del self.SkuOverrideValues[SkuName][DefaultStoreName][FieldName]
self.SkuOverrideValues[SkuName][DefaultStoreName][FieldName] = [Value.strip(), FileName, LineNo]
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 16/42] BaseTools: remove Compound statements
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (14 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 15/42] BaseTools: eliminate {} from dictionary constructor call Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 17/42] BaseTools: Workspace - refactor a dict Jaben Carsey
` (26 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Split them into two separate lines.
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/Workspace/BuildClassObject.py | 15 ++++++++++-----
1 file changed, 10 insertions(+), 5 deletions(-)
diff --git a/BaseTools/Source/Python/Workspace/BuildClassObject.py b/BaseTools/Source/Python/Workspace/BuildClassObject.py
index 8b3e2ec29973..6ca3cd9da22b 100644
--- a/BaseTools/Source/Python/Workspace/BuildClassObject.py
+++ b/BaseTools/Source/Python/Workspace/BuildClassObject.py
@@ -114,11 +114,16 @@ class PcdClassObject(object):
class StructurePcd(PcdClassObject):
def __init__(self, StructuredPcdIncludeFile=None, Packages=None, Name=None, Guid=None, Type=None, DatumType=None, Value=None, Token=None, MaxDatumSize=None, SkuInfoList=None, IsOverrided=False, GuidValue=None, validateranges=None, validlists=None, expressions=None,default_store = TAB_DEFAULT_STORES_DEFAULT):
- if SkuInfoList is None: SkuInfoList={}
- if validateranges is None: validateranges=[]
- if validlists is None: validlists=[]
- if expressions is None : expressions=[]
- if Packages is None : Packages = []
+ if SkuInfoList is None:
+ SkuInfoList = {}
+ if validateranges is None:
+ validateranges = []
+ if validlists is None:
+ validlists = []
+ if expressions is None:
+ expressions = []
+ if Packages is None:
+ Packages = []
super(StructurePcd, self).__init__(Name, Guid, Type, DatumType, Value, Token, MaxDatumSize, SkuInfoList, IsOverrided, GuidValue, validateranges, validlists, expressions)
self.StructuredPcdIncludeFile = [] if StructuredPcdIncludeFile is None else StructuredPcdIncludeFile
self.PackageDecs = Packages
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 17/42] BaseTools: Workspace - refactor a dict
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (15 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 16/42] BaseTools: remove Compound statements Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 18/42] BaseTools: move PCD size calculation functions to PcdClassObject Jaben Carsey
` (25 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Change a dict to a set, since the values are never examined; only the keys matter.
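A minimal sketch of the idea (illustrative only; the names are simplified): when a dict's values are never read, a set of its keys carries the same information. Note that set.add() takes a single argument, so a (sku, store) pair must be passed as one tuple:

    # before: values stored but never examined
    value_chain = {}
    value_chain[('SKU1', 'STD')] = ('DEFAULT', 'STD')

    # after: only key membership matters, so a set is enough
    value_chain = set()
    value_chain.add(('SKU1', 'STD'))    # one tuple argument, not two arguments
    assert ('SKU1', 'STD') in value_chain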
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/Workspace/BuildClassObject.py | 2 +-
BaseTools/Source/Python/Workspace/DscBuildData.py | 4 ++--
2 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/BaseTools/Source/Python/Workspace/BuildClassObject.py b/BaseTools/Source/Python/Workspace/BuildClassObject.py
index 6ca3cd9da22b..258905e80f25 100644
--- a/BaseTools/Source/Python/Workspace/BuildClassObject.py
+++ b/BaseTools/Source/Python/Workspace/BuildClassObject.py
@@ -136,7 +136,7 @@ class StructurePcd(PcdClassObject):
self.PcdDefineLineNo = 0
self.PkgPath = ""
self.DefaultValueFromDec = ""
- self.ValueChain = dict()
+ self.ValueChain = set()
self.PcdFieldValueFromComm = collections.OrderedDict()
def __repr__(self):
return self.TypeName
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 335267ebc576..8ebac8957a1b 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -1347,7 +1347,7 @@ class DscBuildData(PlatformBuildClassObject):
nextskuid = self.SkuIdMgr.GetNextSkuId(nextskuid)
stru_pcd.SkuOverrideValues[skuid] = copy.deepcopy(stru_pcd.SkuOverrideValues[nextskuid]) if not NoDefault else copy.deepcopy({defaultstorename: stru_pcd.DefaultValues for defaultstorename in DefaultStores} if DefaultStores else {TAB_DEFAULT_STORES_DEFAULT:stru_pcd.DefaultValues})
if not NoDefault:
- stru_pcd.ValueChain[(skuid,'')]= (nextskuid,'')
+ stru_pcd.ValueChain.add((skuid,''))
if stru_pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
for skuid in SkuIds:
nextskuid = skuid
@@ -1366,7 +1366,7 @@ class DscBuildData(PlatformBuildClassObject):
for defaultstoreid in DefaultStores:
if defaultstoreid not in stru_pcd.SkuOverrideValues[skuid]:
stru_pcd.SkuOverrideValues[skuid][defaultstoreid] = copy.deepcopy(stru_pcd.SkuOverrideValues[nextskuid][mindefaultstorename])
- stru_pcd.ValueChain[(skuid,defaultstoreid)]= (nextskuid,mindefaultstorename)
+ stru_pcd.ValueChain.add((skuid,defaultstoreid))
S_pcd_set = DscBuildData.OverrideByFdfComm(S_pcd_set)
Str_Pcd_Values = self.GenerateByteArrayValue(S_pcd_set)
if Str_Pcd_Values:
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 18/42] BaseTools: move PCD size calculation functions to PcdClassObject
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (16 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 17/42] BaseTools: Workspace - refactor a dict Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 19/42] BaseTools: AutoGen - refactor out functions only called in __init__ Jaben Carsey
` (24 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Move both GetPcdMaxSize and GetPcdSize to PcdClassObject.
Fix MAX_SIZE_TYPE to hold int values instead of strings.
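As a rough illustration (not taken from the patch), storing ints instead of strings lets consumers use the sizes directly in arithmetic without converting first:

    MAX_SIZE_TYPE = {'BOOLEAN': 1, 'UINT8': 1, 'UINT16': 2, 'UINT32': 4, 'UINT64': 8}

    storage_width = MAX_SIZE_TYPE['UINT32']    # already an int, no int(...) needed
    length = 5 + 3 * storage_width             # 17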
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/GenC.py | 37 ++++----------------
BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py | 2 +-
BaseTools/Source/Python/Common/DataType.py | 2 +-
BaseTools/Source/Python/Common/VpdInfoFile.py | 2 +-
BaseTools/Source/Python/Workspace/BuildClassObject.py | 30 ++++++++++++++++
BaseTools/Source/Python/Workspace/DscBuildData.py | 19 +---------
6 files changed, 40 insertions(+), 52 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/GenC.py b/BaseTools/Source/Python/AutoGen/GenC.py
index 73d7699ad01b..4e7e3d90be64 100644
--- a/BaseTools/Source/Python/AutoGen/GenC.py
+++ b/BaseTools/Source/Python/AutoGen/GenC.py
@@ -868,31 +868,6 @@ def DynExPcdTokenNumberMapping(Info, AutoGenH):
% (RealTokenCName, RealTokenCName, RealTokenCName, RealTokenCName))
TokenCNameList.add(TokenCName)
-def GetPcdSize(Pcd):
- if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
- Value = Pcd.DefaultValue
- if Value in [None, '']:
- return 1
- elif Value[0] == 'L':
- return (len(Value) - 2) * 2
- elif Value[0] == '{':
- return len(Value.split(','))
- else:
- return len(Value) - 1
- if Pcd.DatumType == TAB_UINT64:
- return 8
- if Pcd.DatumType == TAB_UINT32:
- return 4
- if Pcd.DatumType == TAB_UINT16:
- return 2
- if Pcd.DatumType == TAB_UINT8:
- return 1
- if Pcd.DatumType == 'BOOLEAN':
- return 1
- else:
- return Pcd.MaxDatumSize
-
-
## Create code for module PCDs
#
# @param Info The ModuleAutoGen object
@@ -1115,7 +1090,7 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
"The maximum size of VOID* type PCD '%s.%s' is less than its actual size occupied." % (Pcd.TokenSpaceGuidCName, TokenCName),
ExtraData="[%s]" % str(Info))
else:
- ArraySize = GetPcdSize(Pcd)
+ ArraySize = Pcd.GetPcdSize()
if Unicode:
ArraySize = ArraySize / 2
Value = NewValue + '0 }'
@@ -1155,7 +1130,7 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
AutoGenH.Append('extern %s UINT8 %s%s;\n' %(Const, PcdVariableName, Array))
AutoGenH.Append('#define %s %s%s\n' %(GetModeName, Type, PcdVariableName))
- PcdDataSize = GetPcdSize(Pcd)
+ PcdDataSize = Pcd.GetPcdSize()
if Pcd.Type == TAB_PCDS_FIXED_AT_BUILD:
AutoGenH.Append('#define %s %s\n' % (FixPcdSizeTokenName, PcdDataSize))
AutoGenH.Append('#define %s %s \n' % (GetModeSizeName,FixPcdSizeTokenName))
@@ -1172,14 +1147,14 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
AutoGenH.Append('extern volatile %s %s %s%s;\n' % (Const, Pcd.DatumType, PcdVariableName, Array))
AutoGenH.Append('#define %s %s%s\n' % (GetModeName, Type, PcdVariableName))
- PcdDataSize = GetPcdSize(Pcd)
+ PcdDataSize = Pcd.GetPcdSize()
AutoGenH.Append('#define %s %s\n' % (PatchPcdSizeTokenName, PcdDataSize))
AutoGenH.Append('#define %s %s \n' % (GetModeSizeName,PatchPcdSizeVariableName))
AutoGenH.Append('extern UINTN %s; \n' % PatchPcdSizeVariableName)
AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED UINTN %s = %s;\n' % (PatchPcdSizeVariableName,PcdDataSize))
else:
- PcdDataSize = GetPcdSize(Pcd)
+ PcdDataSize = Pcd.GetPcdSize()
AutoGenH.Append('#define %s %s\n' % (FixPcdSizeTokenName, PcdDataSize))
AutoGenH.Append('#define %s %s \n' % (GetModeSizeName,FixPcdSizeTokenName))
@@ -1338,7 +1313,7 @@ def CreateLibraryPcdCode(Info, AutoGenC, AutoGenH, Pcd):
else:
AutoGenH.Append('extern volatile %s %s%s;\n' % (DatumType, PcdVariableName, Array))
AutoGenH.Append('#define %s %s_gPcd_BinaryPatch_%s\n' %(GetModeName, Type, TokenCName))
- PcdDataSize = GetPcdSize(Pcd)
+ PcdDataSize = Pcd.GetPcdSize()
if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
AutoGenH.Append('#define %s(SizeOfBuffer, Buffer) LibPatchPcdSetPtrAndSize((VOID *)_gPcd_BinaryPatch_%s, &%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeName, TokenCName, PatchPcdSizeVariableName, PatchPcdMaxSizeVariable))
AutoGenH.Append('#define %s(SizeOfBuffer, Buffer) LibPatchPcdSetPtrAndSizeS((VOID *)_gPcd_BinaryPatch_%s, &%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeStatusName, TokenCName, PatchPcdSizeVariableName, PatchPcdMaxSizeVariable))
@@ -1372,7 +1347,7 @@ def CreateLibraryPcdCode(Info, AutoGenC, AutoGenH, Pcd):
AutoGenH.Append('#define _PCD_VALUE_%s %s%s\n' %(TokenCName, Type, PcdVariableName))
else:
AutoGenH.Append('#define _PCD_VALUE_%s %s\n' %(TokenCName, Pcd.DefaultValue))
- PcdDataSize = GetPcdSize(Pcd)
+ PcdDataSize = Pcd.GetPcdSize()
if PcdItemType == TAB_PCDS_FIXED_AT_BUILD:
if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
if ConstFixedPcd:
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index dd78dc520075..2f8f4fac23f8 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -253,7 +253,7 @@ class VAR_CHECK_PCD_VALID_OBJ(object):
def updateStorageWidth(self):
try:
- self.StorageWidth = int(MAX_SIZE_TYPE[self.PcdDataType])
+ self.StorageWidth = MAX_SIZE_TYPE[self.PcdDataType]
except:
self.StorageWidth = 0
self.ValidData = False
diff --git a/BaseTools/Source/Python/Common/DataType.py b/BaseTools/Source/Python/Common/DataType.py
index 20f31ce4b72f..8af94354620c 100644
--- a/BaseTools/Source/Python/Common/DataType.py
+++ b/BaseTools/Source/Python/Common/DataType.py
@@ -278,7 +278,7 @@ TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_LIST = [TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_
## The mapping dictionary from datum type to its maximum number.
MAX_VAL_TYPE = {"BOOLEAN":0x01, TAB_UINT8:0xFF, TAB_UINT16:0xFFFF, TAB_UINT32:0xFFFFFFFF, TAB_UINT64:0xFFFFFFFFFFFFFFFF}
## The mapping dictionary from datum type to size string.
-MAX_SIZE_TYPE = {"BOOLEAN":"1", TAB_UINT8:"1", TAB_UINT16:"2", TAB_UINT32:"4", TAB_UINT64:"8"}
+MAX_SIZE_TYPE = {"BOOLEAN":1, TAB_UINT8:1, TAB_UINT16:2, TAB_UINT32:4, TAB_UINT64:8}
TAB_DEPEX = 'Depex'
TAB_DEPEX_COMMON = TAB_DEPEX + TAB_SPLIT + TAB_ARCH_COMMON
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index 155693740f12..32895deb5d0c 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -99,7 +99,7 @@ class VpdInfoFile:
EdkLogger.error("VpdInfoFile", BuildToolError.PARAMETER_INVALID,
"Invalid max datum size for VPD PCD %s.%s" % (Vpd.TokenSpaceGuidCName, Vpd.TokenCName))
elif Vpd.DatumType in TAB_PCD_NUMERIC_TYPES:
- if Vpd.MaxDatumSize is None or Vpd.MaxDatumSize == "":
+ if not Vpd.MaxDatumSize:
Vpd.MaxDatumSize = MAX_SIZE_TYPE[Vpd.DatumType]
else:
if Vpd.MaxDatumSize <= 0:
diff --git a/BaseTools/Source/Python/Workspace/BuildClassObject.py b/BaseTools/Source/Python/Workspace/BuildClassObject.py
index 258905e80f25..52725e968226 100644
--- a/BaseTools/Source/Python/Workspace/BuildClassObject.py
+++ b/BaseTools/Source/Python/Workspace/BuildClassObject.py
@@ -72,6 +72,36 @@ class PcdClassObject(object):
self.PcdValueFromComm = ""
self.DefinitionPosition = ("","")
+ ## Get the maximum number of bytes
+ def GetPcdMaxSize(self):
+ if self.DatumType in TAB_PCD_NUMERIC_TYPES:
+ return MAX_SIZE_TYPE[self.DatumType]
+
+ MaxSize = int(self.MaxDatumSize,10) if self.MaxDatumSize else 0
+ if self.PcdValueFromComm:
+ if self.PcdValueFromComm.startswith("{") and self.PcdValueFromComm.endswith("}"):
+ return max([len(self.PcdValueFromComm.split(",")),MaxSize])
+ elif self.PcdValueFromComm.startswith("\"") or self.PcdValueFromComm.startswith("\'"):
+ return max([len(self.PcdValueFromComm)-2+1,MaxSize])
+ elif self.PcdValueFromComm.startswith("L\""):
+ return max([2*(len(self.PcdValueFromComm)-3+1),MaxSize])
+ else:
+ return max([len(self.PcdValueFromComm),MaxSize])
+ return MaxSize
+
+ ## Get the number of bytes
+ def GetPcdSize(self):
+ if self.DatumType in TAB_PCD_NUMERIC_TYPES:
+ return MAX_SIZE_TYPE[self.DatumType]
+ if not self.DefaultValue:
+ return 1
+ elif self.DefaultValue[0] == 'L':
+ return (len(self.DefaultValue) - 2) * 2
+ elif self.DefaultValue[0] == '{':
+ return len(self.DefaultValue.split(','))
+ else:
+ return len(self.DefaultValue) - 1
+
## Convert the class to a string
#
# Convert each member of the class to string
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 8ebac8957a1b..13a1ed886cc4 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -1535,23 +1535,6 @@ class DscBuildData(PlatformBuildClassObject):
Result = Result + '"'
return Result
- @staticmethod
- def GetPcdMaxSize(Pcd):
- if Pcd.DatumType in TAB_PCD_NUMERIC_TYPES:
- return MAX_SIZE_TYPE[Pcd.DatumType]
-
- MaxSize = int(Pcd.MaxDatumSize,10) if Pcd.MaxDatumSize else 0
- if Pcd.PcdValueFromComm:
- if Pcd.PcdValueFromComm.startswith("{") and Pcd.PcdValueFromComm.endswith("}"):
- return max([len(Pcd.PcdValueFromComm.split(",")),MaxSize])
- elif Pcd.PcdValueFromComm.startswith("\"") or Pcd.PcdValueFromComm.startswith("\'"):
- return max([len(Pcd.PcdValueFromComm)-2+1,MaxSize])
- elif Pcd.PcdValueFromComm.startswith("L\""):
- return max([2*(len(Pcd.PcdValueFromComm)-3+1),MaxSize])
- else:
- return max([len(Pcd.PcdValueFromComm),MaxSize])
- return MaxSize
-
def GenerateSizeFunction(self,Pcd):
CApp = "// Default Value in Dec \n"
CApp = CApp + "void Cal_%s_%s_Size(UINT32 *Size){\n" % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
@@ -1634,7 +1617,7 @@ class DscBuildData(PlatformBuildClassObject):
while '[' in FieldName:
FieldName = FieldName.rsplit('[', 1)[0]
CApp = CApp + ' __FLEXIBLE_SIZE(*Size, %s, %s, %d); // From %s Line %d Value %s \n' % (Pcd.DatumType, FieldName.strip("."), ArrayIndex + 1, Pcd.PcdFieldValueFromComm[FieldName_ori][1], Pcd.PcdFieldValueFromComm[FieldName_ori][2], Pcd.PcdFieldValueFromComm[FieldName_ori][0])
- CApp = CApp + " *Size = (%d > *Size ? %d : *Size); // The Pcd maxsize is %d \n" % (DscBuildData.GetPcdMaxSize(Pcd),DscBuildData.GetPcdMaxSize(Pcd),DscBuildData.GetPcdMaxSize(Pcd))
+ CApp = CApp + " *Size = (%d > *Size ? %d : *Size); // The Pcd maxsize is %d \n" % (Pcd.GetPcdMaxSize(),Pcd.GetPcdMaxSize(),Pcd.GetPcdMaxSize())
CApp = CApp + "}\n"
return CApp
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 19/42] BaseTools: AutoGen - refactor out functions only called in __init__
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (17 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 18/42] BaseTools: move PCD size calculation functions to PcdClassObject Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 20/42] BaseTools: AutoGen - refactor out a list Jaben Carsey
` (23 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py | 16 +---------------
1 file changed, 1 insertion(+), 15 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index 2f8f4fac23f8..0b4677b62a73 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -244,16 +244,12 @@ class VAR_CHECK_PCD_VALID_OBJ(object):
self.Type = 1
self.Length = 0 # Length include this header
self.VarOffset = VarOffset
- self.StorageWidth = 0
self.PcdDataType = PcdDataType.strip()
self.rawdata = data
self.data = set()
- self.ValidData = True
- self.updateStorageWidth()
-
- def updateStorageWidth(self):
try:
self.StorageWidth = MAX_SIZE_TYPE[self.PcdDataType]
+ self.ValidData = True
except:
self.StorageWidth = 0
self.ValidData = False
@@ -265,9 +261,6 @@ class VAR_CHECK_PCD_VALID_LIST(VAR_CHECK_PCD_VALID_OBJ):
def __init__(self, VarOffset, validlist, PcdDataType):
super(VAR_CHECK_PCD_VALID_LIST, self).__init__(VarOffset, validlist, PcdDataType)
self.Type = 1
- self.update_data()
- self.update_size()
- def update_data(self):
valid_num_list = []
data_list = []
for item in self.rawdata:
@@ -283,8 +276,6 @@ class VAR_CHECK_PCD_VALID_LIST(VAR_CHECK_PCD_VALID_OBJ):
self.data = set(data_list)
-
- def update_size(self):
self.Length = 5 + len(self.data) * self.StorageWidth
@@ -292,9 +283,6 @@ class VAR_CHECK_PCD_VALID_RANGE(VAR_CHECK_PCD_VALID_OBJ):
def __init__(self, VarOffset, validrange, PcdDataType):
super(VAR_CHECK_PCD_VALID_RANGE, self).__init__(VarOffset, validrange, PcdDataType)
self.Type = 2
- self.update_data()
- self.update_size()
- def update_data(self):
RangeExpr = ""
data_list = []
i = 0
@@ -308,8 +296,6 @@ class VAR_CHECK_PCD_VALID_RANGE(VAR_CHECK_PCD_VALID_OBJ):
for obj in rangelist.pop():
data_list.append((obj.start, obj.end))
self.data = set(data_list)
-
- def update_size(self):
self.Length = 5 + len(self.data) * 2 * self.StorageWidth
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 20/42] BaseTools: AutoGen - refactor out a list
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (18 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 19/42] BaseTools: AutoGen - refactor out functions only called in __init__ Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 21/42] BaseTools: AutoGen - refactor out a useless class Jaben Carsey
` (22 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
The lists were used in __init__ and then converted to sets.
Instead, just use sets from the beginning.
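Roughly, the change replaces the build-a-list-then-convert pattern with direct set construction (a sketch, not the actual BaseTools code):

    raw = ['0x1', '2', '0x1']

    # before: an intermediate list, converted at the end
    data_list = []
    for item in raw:
        data_list.append(int(item, 0))
    data = set(data_list)

    # after: add to the set directly, no throwaway list
    data = set()
    for item in raw:
        data.add(int(item, 0))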
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py | 10 +++-------
1 file changed, 3 insertions(+), 7 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index 0b4677b62a73..b491b68f6e87 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -262,7 +262,6 @@ class VAR_CHECK_PCD_VALID_LIST(VAR_CHECK_PCD_VALID_OBJ):
super(VAR_CHECK_PCD_VALID_LIST, self).__init__(VarOffset, validlist, PcdDataType)
self.Type = 1
valid_num_list = []
- data_list = []
for item in self.rawdata:
valid_num_list.extend(item.split(','))
@@ -270,12 +269,11 @@ class VAR_CHECK_PCD_VALID_LIST(VAR_CHECK_PCD_VALID_OBJ):
valid_num = valid_num.strip()
if valid_num.startswith('0x') or valid_num.startswith('0X'):
- data_list.append(int(valid_num, 16))
+ self.data.add(int(valid_num, 16))
else:
- data_list.append(int(valid_num))
+ self.data.add(int(valid_num))
- self.data = set(data_list)
self.Length = 5 + len(self.data) * self.StorageWidth
@@ -284,7 +282,6 @@ class VAR_CHECK_PCD_VALID_RANGE(VAR_CHECK_PCD_VALID_OBJ):
super(VAR_CHECK_PCD_VALID_RANGE, self).__init__(VarOffset, validrange, PcdDataType)
self.Type = 2
RangeExpr = ""
- data_list = []
i = 0
for item in self.rawdata:
if i == 0:
@@ -294,8 +291,7 @@ class VAR_CHECK_PCD_VALID_RANGE(VAR_CHECK_PCD_VALID_OBJ):
range_result = RangeExpression(RangeExpr, self.PcdDataType)(True)
for rangelist in range_result:
for obj in rangelist.pop():
- data_list.append((obj.start, obj.end))
- self.data = set(data_list)
+ self.data.add((obj.start, obj.end))
self.Length = 5 + len(self.data) * 2 * self.StorageWidth
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 21/42] BaseTools: AutoGen - refactor out a useless class
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (19 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 20/42] BaseTools: AutoGen - refactor out a list Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 22/42] BaseTools: AutoGen - no need to recompute Jaben Carsey
` (21 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
This class was never instantiated; only its static function was called.
Keep the function and remove the rest.
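The general shape of the change (a sketch with hypothetical names): a class that is never instantiated and only wraps one staticmethod can collapse into a plain module-level function:

    # before: a class used purely as a namespace for one static method
    class ValidObjectFactory(object):
        @staticmethod
        def get_valid_object(pcd, offset):
            return (pcd, offset)

    # after: a module-level function does the same job with less ceremony
    def get_validation_object(pcd, offset):
        return (pcd, offset)

    assert ValidObjectFactory.get_valid_object(1, 2) == get_validation_object(1, 2)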
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/GenPcdDb.py | 4 ++--
BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py | 18 +++++++-----------
2 files changed, 9 insertions(+), 13 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index 39e3c6896441..f816ccaae311 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -16,7 +16,7 @@ from Common.String import StringToArray
from struct import pack
from ValidCheckingInfoObject import VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER
from ValidCheckingInfoObject import VAR_CHECK_PCD_VARIABLE_TAB
-from ValidCheckingInfoObject import VAR_VALID_OBJECT_FACTORY
+from ValidCheckingInfoObject import GetValidationObject
from Common.VariableAttributes import VariableAttributes
import copy
from struct import unpack
@@ -1247,7 +1247,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
if Platform.Platform.VarCheckFlag:
var_check_obj = VAR_CHECK_PCD_VARIABLE_TAB(VariableGuidStructure, StringToArray(Sku.VariableName))
try:
- var_check_obj.push_back(VAR_VALID_OBJECT_FACTORY.Get_valid_object(Pcd, Sku.VariableOffset))
+ var_check_obj.push_back(GetValidationObject(Pcd, Sku.VariableOffset))
VarAttr, _ = VariableAttributes.GetVarAttributes(Sku.VariableAttribute)
var_check_obj.SetAttributes(VarAttr)
var_check_obj.UpdateSize()
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index b491b68f6e87..92c8fe2df904 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -295,14 +295,10 @@ class VAR_CHECK_PCD_VALID_RANGE(VAR_CHECK_PCD_VALID_OBJ):
self.Length = 5 + len(self.data) * 2 * self.StorageWidth
-class VAR_VALID_OBJECT_FACTORY(object):
- def __init__(self):
- pass
- @staticmethod
- def Get_valid_object(PcdClass, VarOffset):
- if PcdClass.validateranges:
- return VAR_CHECK_PCD_VALID_RANGE(VarOffset, PcdClass.validateranges, PcdClass.DatumType)
- if PcdClass.validlists:
- return VAR_CHECK_PCD_VALID_LIST(VarOffset, PcdClass.validlists, PcdClass.DatumType)
- else:
- return None
+def GetValidationObject(PcdClass, VarOffset):
+ if PcdClass.validateranges:
+ return VAR_CHECK_PCD_VALID_RANGE(VarOffset, PcdClass.validateranges, PcdClass.DatumType)
+ if PcdClass.validlists:
+ return VAR_CHECK_PCD_VALID_LIST(VarOffset, PcdClass.validlists, PcdClass.DatumType)
+ else:
+ return None
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 22/42] BaseTools: AutoGen - no need to recompute
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (20 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 21/42] BaseTools: AutoGen - refactor out a useless class Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 23/42] BaseTools: refactor __init__ functions to not compute temporary variable Jaben Carsey
` (20 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Looping over a list and recomputing the same value has no impact on the final value.
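A sketch of the redundancy (illustrative values): the loop body never used the loop variable, so the assignment can simply be hoisted out of the loop:

    item_size, index = 4, 7

    # before: the same assignment runs once per element with an identical result
    offset = 0
    for _ in range(10):
        offset = item_size * index

    # after: compute it once
    offset = item_size * index    # 28 either way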
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/GenPcdDb.py | 3 +--
1 file changed, 1 insertion(+), 2 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index f816ccaae311..70c7db7ca7c4 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -301,8 +301,7 @@ class DbItemList:
for ItemIndex in xrange(Index):
Offset += len(self.RawDataList[ItemIndex])
else:
- for Datas in self.RawDataList:
- Offset = self.ItemSize * Index
+ Offset = self.ItemSize * Index
return Offset
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 23/42] BaseTools: refactor __init__ functions to not compute temporary variable
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (21 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 22/42] BaseTools: AutoGen - no need to recompute Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 24/42] BaseTools: AutoGen - remove function no one calls Jaben Carsey
` (19 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Just assign the correct value to the member variable in __init__, or call the
parent __init__.
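The underlying idiom (a sketch; DbItems is a hypothetical stand-in for the DbItemList-style classes): the None check for a mutable default folds into a single expression in the assignment, and subclasses that only repeated the check can just call the parent __init__:

    class DbItems(object):
        def __init__(self, item_size, data_list=None):
            self.ItemSize = item_size
            self.DataList = data_list if data_list else []

    assert DbItems(4).DataList == []
    assert DbItems(4, [1, 2]).DataList == [1, 2]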
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/GenPcdDb.py | 37 ++++----------------
1 file changed, 7 insertions(+), 30 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index 70c7db7ca7c4..4929bcf899ae 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -282,13 +282,9 @@ def toHex(s):
#
class DbItemList:
def __init__(self, ItemSize, DataList=None, RawDataList=None):
- if DataList is None:
- DataList = []
- if RawDataList is None:
- RawDataList = []
self.ItemSize = ItemSize
- self.DataList = DataList
- self.RawDataList = RawDataList
+ self.DataList = DataList if DataList else []
+ self.RawDataList = RawDataList if RawDataList else []
self.ListSize = 0
def GetInterOffset(self, Index):
@@ -357,11 +353,8 @@ class DbItemList:
#
class DbExMapTblItemList (DbItemList):
def __init__(self, ItemSize, DataList=None, RawDataList=None):
- if DataList is None:
- DataList = []
- if RawDataList is None:
- RawDataList = []
DbItemList.__init__(self, ItemSize, DataList, RawDataList)
+
def PackData(self):
Buffer = ''
PackStr = "=LHH"
@@ -379,11 +372,8 @@ class DbExMapTblItemList (DbItemList):
#
class DbComItemList (DbItemList):
def __init__(self, ItemSize, DataList=None, RawDataList=None):
- if DataList is None:
- DataList = []
- if RawDataList is None:
- RawDataList = []
DbItemList.__init__(self, ItemSize, DataList, RawDataList)
+
def GetInterOffset(self, Index):
Offset = 0
if self.ItemSize == 0:
@@ -443,11 +433,8 @@ class DbComItemList (DbItemList):
#
class DbVariableTableItemList (DbComItemList):
def __init__(self, ItemSize, DataList=None, RawDataList=None):
- if DataList is None:
- DataList = []
- if RawDataList is None:
- RawDataList = []
DbComItemList.__init__(self, ItemSize, DataList, RawDataList)
+
def PackData(self):
PackStr = "=LLHHLHH"
Buffer = ''
@@ -465,10 +452,6 @@ class DbVariableTableItemList (DbComItemList):
class DbStringHeadTableItemList(DbItemList):
def __init__(self,ItemSize,DataList=None,RawDataList=None):
- if DataList is None:
- DataList = []
- if RawDataList is None:
- RawDataList = []
DbItemList.__init__(self, ItemSize, DataList, RawDataList)
def GetInterOffset(self, Index):
@@ -511,11 +494,8 @@ class DbStringHeadTableItemList(DbItemList):
#
class DbSkuHeadTableItemList (DbItemList):
def __init__(self, ItemSize, DataList=None, RawDataList=None):
- if DataList is None:
- DataList = []
- if RawDataList is None:
- RawDataList = []
DbItemList.__init__(self, ItemSize, DataList, RawDataList)
+
def PackData(self):
PackStr = "=LL"
Buffer = ''
@@ -531,11 +511,8 @@ class DbSkuHeadTableItemList (DbItemList):
#
class DbSizeTableItemList (DbItemList):
def __init__(self, ItemSize, DataList=None, RawDataList=None):
- if DataList is None:
- DataList = []
- if RawDataList is None:
- RawDataList = []
DbItemList.__init__(self, ItemSize, DataList, RawDataList)
+
def GetListSize(self):
length = 0
for Data in self.RawDataList:
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 24/42] BaseTools: AutoGen - remove function no one calls
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (22 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 23/42] BaseTools: refactor __init__ functions to not compute temporary variable Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 25/42] BaseTools: AutoGen - move function to clean file namespace Jaben Carsey
` (18 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/GenPcdDb.py | 11 -----------
1 file changed, 11 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index 4929bcf899ae..6398a077e53a 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -261,17 +261,6 @@ def PackGuid(GuidStructureValue):
)
return Buffer
-def toHex(s):
- lst = []
- for ch in s:
- hv = hex(ord(ch)).replace('0x', ' ')
- if len(hv) == 1:
- hv = '0'+hv
- lst.append(hv)
- if lst:
- return reduce(lambda x,y:x+y, lst)
- else:
- return 'empty'
## DbItemList
#
# The class holds the Pcd database items. ItemSize if not zero should match the item datum type in the C structure.
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 25/42] BaseTools: AutoGen - move function to clean file namespace
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (23 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 24/42] BaseTools: AutoGen - remove function no one calls Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 26/42] BaseTools: AutoGen - remove another function no one calls Jaben Carsey
` (17 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel
The function is only used in one other function,
so just move it there.
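The general pattern (a sketch with hypothetical names): a helper called from exactly one place can be nested inside its only caller, keeping the module namespace clean:

    from struct import pack

    # before: a module-level helper visible to the whole file
    def _pack_one(value):
        return pack('=L', value)

    def pack_data(values):
        return b''.join(_pack_one(v) for v in values)

    # after: the helper lives inside its only caller
    def pack_data(values):
        def pack_one(value):
            return pack('=L', value)
        return b''.join(pack_one(v) for v in values)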
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/GenPcdDb.py | 52 ++++++++++----------
1 file changed, 26 insertions(+), 26 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index 6398a077e53a..a6d2381cfd3f 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -235,32 +235,6 @@ ${PHASE}_PCD_DATABASE_INIT g${PHASE}PcdDbInit = {
#endif
""")
-## PackGuid
-#
-# Pack the GUID value in C structure format into data array
-#
-# @param GuidStructureValue: The GUID value in C structure format
-#
-# @retval Buffer: a data array contains the Guid
-#
-def PackGuid(GuidStructureValue):
- GuidString = GuidStructureStringToGuidString(GuidStructureValue)
- Guid = GuidString.split('-')
- Buffer = pack('=LHHBBBBBBBB',
- int(Guid[0], 16),
- int(Guid[1], 16),
- int(Guid[2], 16),
- int(Guid[3][-4:-2], 16),
- int(Guid[3][-2:], 16),
- int(Guid[4][-12:-10], 16),
- int(Guid[4][-10:-8], 16),
- int(Guid[4][-8:-6], 16),
- int(Guid[4][-6:-4], 16),
- int(Guid[4][-4:-2], 16),
- int(Guid[4][-2:], 16)
- )
- return Buffer
-
## DbItemList
#
# The class holds the Pcd database items. ItemSize if not zero should match the item datum type in the C structure.
@@ -303,6 +277,32 @@ class DbItemList:
return self.ListSize
def PackData(self):
+ ## PackGuid
+ #
+ # Pack the GUID value in C structure format into data array
+ #
+ # @param GuidStructureValue: The GUID value in C structure format
+ #
+ # @retval Buffer: a data array contains the Guid
+ #
+ def PackGuid(GuidStructureValue):
+ GuidString = GuidStructureStringToGuidString(GuidStructureValue)
+ Guid = GuidString.split('-')
+ Buffer = pack('=LHHBBBBBBBB',
+ int(Guid[0], 16),
+ int(Guid[1], 16),
+ int(Guid[2], 16),
+ int(Guid[3][-4:-2], 16),
+ int(Guid[3][-2:], 16),
+ int(Guid[4][-12:-10], 16),
+ int(Guid[4][-10:-8], 16),
+ int(Guid[4][-8:-6], 16),
+ int(Guid[4][-6:-4], 16),
+ int(Guid[4][-4:-2], 16),
+ int(Guid[4][-2:], 16)
+ )
+ return Buffer
+
if self.ItemSize == 8:
PackStr = "=Q"
elif self.ItemSize == 4:
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 26/42] BaseTools: AutoGen - remove another function no one calls
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (24 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 25/42] BaseTools: AutoGen - move function to clean file namespace Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 27/42] BaseTools: Refactor to share GUID packing function Jaben Carsey
` (16 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/GenVar.py | 5 -----
1 file changed, 5 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/GenVar.py b/BaseTools/Source/Python/AutoGen/GenVar.py
index 9d226d0f4567..4f894f3f73f3 100644
--- a/BaseTools/Source/Python/AutoGen/GenVar.py
+++ b/BaseTools/Source/Python/AutoGen/GenVar.py
@@ -26,11 +26,6 @@ var_info = collections.namedtuple("uefi_var", "pcdindex,pcdname,defaultstoragena
NvStorageHeaderSize = 28
VariableHeaderSize = 32
-def StringArrayToList(StringArray):
- StringArray = StringArray[1:-1]
- StringArray = '[' + StringArray + ']'
- return eval(StringArray)
-
def PackGUID(Guid):
GuidBuffer = pack('=LHHBBBBBBBB',
int(Guid[0], 16),
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 27/42] BaseTools: Refactor to share GUID packing function
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (25 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 26/42] BaseTools: AutoGen - remove another function no one calls Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 28/42] BaseTools: AutoGen - refactor function to remove extra variables Jaben Carsey
` (15 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/GenPcdDb.py | 17 ++---------------
1 file changed, 2 insertions(+), 15 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index a6d2381cfd3f..9280eeee641c 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -21,6 +21,7 @@ from Common.VariableAttributes import VariableAttributes
import copy
from struct import unpack
from Common.DataType import *
+from GenVar import PackGUID
DATABASE_VERSION = 7
@@ -287,21 +288,7 @@ class DbItemList:
#
def PackGuid(GuidStructureValue):
GuidString = GuidStructureStringToGuidString(GuidStructureValue)
- Guid = GuidString.split('-')
- Buffer = pack('=LHHBBBBBBBB',
- int(Guid[0], 16),
- int(Guid[1], 16),
- int(Guid[2], 16),
- int(Guid[3][-4:-2], 16),
- int(Guid[3][-2:], 16),
- int(Guid[4][-12:-10], 16),
- int(Guid[4][-10:-8], 16),
- int(Guid[4][-8:-6], 16),
- int(Guid[4][-6:-4], 16),
- int(Guid[4][-4:-2], 16),
- int(Guid[4][-2:], 16)
- )
- return Buffer
+ return PackGUID(GuidString.split('-'))
if self.ItemSize == 8:
PackStr = "=Q"
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 28/42] BaseTools: AutoGen - refactor function to remove extra variables
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (26 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 27/42] BaseTools: Refactor to share GUID packing function Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 29/42] BaseTools: AutoGen - refactor more functions only called in __init__ Jaben Carsey
` (14 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
We don't need to keep the same data in multiple formats.
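A rough sketch of what assemble_variable now does (simplified; the real code raises differently and uses xrange): walk the offsets in sorted order, pad with '0x00' up to each offset, and take each value straight from the dict instead of from a parallel list:

    def assemble_variable(valuedict):
        var_value = []
        for offset in sorted(valuedict):
            if offset < len(var_value):
                raise ValueError('overlapping offsets')
            # pad the gap before this offset
            var_value.extend('0x00' for _ in range(offset - len(var_value)))
            var_value += valuedict[offset]
        return var_value

    assert assemble_variable({0: ['0x01'], 3: ['0x02', '0x03']}) == \
        ['0x01', '0x00', '0x00', '0x02', '0x03']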
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/GenVar.py | 13 +++++--------
1 file changed, 5 insertions(+), 8 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/GenVar.py b/BaseTools/Source/Python/AutoGen/GenVar.py
index 4f894f3f73f3..e3595bb62315 100644
--- a/BaseTools/Source/Python/AutoGen/GenVar.py
+++ b/BaseTools/Source/Python/AutoGen/GenVar.py
@@ -110,17 +110,14 @@ class VariableMgr(object):
@staticmethod
def assemble_variable(valuedict):
- ordered_offset = sorted(valuedict.keys())
- ordered_value = [valuedict[k] for k in ordered_offset]
+ ordered_valuedict_keys = sorted(valuedict.keys())
var_value = []
- num = 0
- for offset in ordered_offset:
- if offset < len(var_value):
+ for current_valuedict_key in ordered_valuedict_keys:
+ if current_valuedict_key < len(var_value):
raise
- for _ in xrange(offset - len(var_value)):
+ for _ in xrange(current_valuedict_key - len(var_value)):
var_value.append('0x00')
- var_value += ordered_value[num]
- num +=1
+ var_value += valuedict[current_valuedict_key]
return var_value
def process_variable_data(self):
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 29/42] BaseTools: AutoGen - refactor more functions only called in __init__
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (27 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 28/42] BaseTools: AutoGen - refactor function to remove extra variables Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 30/42] BaseTools: remove unused member variable Jaben Carsey
` (13 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/IdfClassObject.py | 89 +++++++++-----------
1 file changed, 40 insertions(+), 49 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/IdfClassObject.py b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
index 82396d3744d5..7bc4e4ffb57b 100644
--- a/BaseTools/Source/Python/AutoGen/IdfClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
@@ -66,61 +66,52 @@ EFI_HII_PACKAGE_TYPE_SYSTEM_END = 0xFF
class IdfFileClassObject(object):
def __init__(self, FileList = []):
- self.FileList = FileList
self.ImageFilesDict = {}
self.ImageIDList = []
- if len(self.FileList) > 0:
- self.LoadIdfFiles(FileList)
+ for File in FileList:
+ if File is None:
+ EdkLogger.error("Image Definition File Parser", PARSER_ERROR, 'No Image definition file is given.')
+ self.File = File
- def LoadIdfFiles(self, FileList):
- if len(FileList) > 0:
- for File in FileList:
- self.LoadIdfFile(File)
+ try:
+ IdfFile = open(LongFilePath(File.Path), mode='r')
+ FileIn = IdfFile.read()
+ IdfFile.close()
+ except:
+ EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=File)
- def LoadIdfFile(self, File = None):
- if File is None:
- EdkLogger.error("Image Definition File Parser", PARSER_ERROR, 'No Image definition file is given.')
- self.File = File
+ ImageFileList = []
+ for Line in FileIn.splitlines():
+ Line = Line.strip()
+ Line = StripComments(Line)
+ if len(Line) == 0:
+ continue
- try:
- IdfFile = open(LongFilePath(File.Path), mode='r')
- FileIn = IdfFile.read()
- IdfFile.close()
- except:
- EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=File)
+ LineNo = GetLineNo(FileIn, Line, False)
+ if not Line.startswith('#image '):
+ EdkLogger.error("Image Definition File Parser", PARSER_ERROR, 'The %s in Line %s of File %s is invalid.' % (Line, LineNo, File.Path))
- ImageFileList = []
- for Line in FileIn.splitlines():
- Line = Line.strip()
- Line = StripComments(Line)
- if len(Line) == 0:
- continue
-
- LineNo = GetLineNo(FileIn, Line, False)
- if not Line.startswith('#image '):
- EdkLogger.error("Image Definition File Parser", PARSER_ERROR, 'The %s in Line %s of File %s is invalid.' % (Line, LineNo, File.Path))
-
- if Line.find('#image ') >= 0:
- LineDetails = Line.split()
- Len = len(LineDetails)
- if Len != 3 and Len != 4:
- EdkLogger.error("Image Definition File Parser", PARSER_ERROR, 'The format is not match #image IMAGE_ID [TRANSPARENT] ImageFileName in Line %s of File %s.' % (LineNo, File.Path))
- if Len == 4 and LineDetails[2] != 'TRANSPARENT':
- EdkLogger.error("Image Definition File Parser", PARSER_ERROR, 'Please use the keyword "TRANSPARENT" to describe the transparency setting in Line %s of File %s.' % (LineNo, File.Path))
- MatchString = gIdentifierPattern.match(LineDetails[1])
- if MatchString is None:
- EdkLogger.error('Image Definition File Parser', FORMAT_INVALID, 'The Image token name %s defined in Idf file %s contains the invalid character.' % (LineDetails[1], File.Path))
- if LineDetails[1] not in self.ImageIDList:
- self.ImageIDList.append(LineDetails[1])
- else:
- EdkLogger.error("Image Definition File Parser", PARSER_ERROR, 'The %s in Line %s of File %s is already defined.' % (LineDetails[1], LineNo, File.Path))
- if Len == 4:
- ImageFile = ImageFileObject(LineDetails[Len-1], LineDetails[1], True)
- else:
- ImageFile = ImageFileObject(LineDetails[Len-1], LineDetails[1], False)
- ImageFileList.append(ImageFile)
- if ImageFileList:
- self.ImageFilesDict[File] = ImageFileList
+ if Line.find('#image ') >= 0:
+ LineDetails = Line.split()
+ Len = len(LineDetails)
+ if Len != 3 and Len != 4:
+ EdkLogger.error("Image Definition File Parser", PARSER_ERROR, 'The format is not match #image IMAGE_ID [TRANSPARENT] ImageFileName in Line %s of File %s.' % (LineNo, File.Path))
+ if Len == 4 and LineDetails[2] != 'TRANSPARENT':
+ EdkLogger.error("Image Definition File Parser", PARSER_ERROR, 'Please use the keyword "TRANSPARENT" to describe the transparency setting in Line %s of File %s.' % (LineNo, File.Path))
+ MatchString = gIdentifierPattern.match(LineDetails[1])
+ if MatchString is None:
+ EdkLogger.error('Image Definition File Parser', FORMAT_INVALID, 'The Image token name %s defined in Idf file %s contains the invalid character.' % (LineDetails[1], File.Path))
+ if LineDetails[1] not in self.ImageIDList:
+ self.ImageIDList.append(LineDetails[1])
+ else:
+ EdkLogger.error("Image Definition File Parser", PARSER_ERROR, 'The %s in Line %s of File %s is already defined.' % (LineDetails[1], LineNo, File.Path))
+ if Len == 4:
+ ImageFile = ImageFileObject(LineDetails[Len-1], LineDetails[1], True)
+ else:
+ ImageFile = ImageFileObject(LineDetails[Len-1], LineDetails[1], False)
+ ImageFileList.append(ImageFile)
+ if ImageFileList:
+ self.ImageFilesDict[File] = ImageFileList
def SearchImageID(ImageFileObject, FileList):
if FileList == []:
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 30/42] BaseTools: remove unused member variable
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (28 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 29/42] BaseTools: AutoGen - refactor more functions only called in __init__ Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 31/42] BaseTools: remove redundant content in InfSectionParser Jaben Carsey
` (12 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/IdfClassObject.py | 1 -
1 file changed, 1 deletion(-)
diff --git a/BaseTools/Source/Python/AutoGen/IdfClassObject.py b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
index 7bc4e4ffb57b..769790d965b5 100644
--- a/BaseTools/Source/Python/AutoGen/IdfClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
@@ -71,7 +71,6 @@ class IdfFileClassObject(object):
for File in FileList:
if File is None:
EdkLogger.error("Image Definition File Parser", PARSER_ERROR, 'No Image definition file is given.')
- self.File = File
try:
IdfFile = open(LongFilePath(File.Path), mode='r')
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 31/42] BaseTools: remove redundant content in InfSectionParser
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (29 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 30/42] BaseTools: remove unused member variable Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 32/42] BaseTools: trim whitespace Jaben Carsey
` (11 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/InfSectionParser.py | 9 ++++-----
1 file changed, 4 insertions(+), 5 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/InfSectionParser.py b/BaseTools/Source/Python/AutoGen/InfSectionParser.py
index cf4e76159e81..2cd5a6667a02 100644
--- a/BaseTools/Source/Python/AutoGen/InfSectionParser.py
+++ b/BaseTools/Source/Python/AutoGen/InfSectionParser.py
@@ -26,7 +26,6 @@ class InfSectionParser():
self._ParserInf()
def _ParserInf(self):
- Filename = self._FilePath
FileLinesList = []
UserExtFind = False
FindEnd = True
@@ -35,9 +34,9 @@ class InfSectionParser():
SectionData = []
try:
- FileLinesList = open(Filename, "r", 0).readlines()
+ FileLinesList = open(self._FilePath, "r", 0).readlines()
except BaseException:
- EdkLogger.error("build", AUTOGEN_ERROR, 'File %s is opened failed.' % Filename)
+ EdkLogger.error("build", AUTOGEN_ERROR, 'File %s is opened failed.' % self._FilePath)
for Index in range(0, len(FileLinesList)):
line = str(FileLinesList[Index]).strip()
@@ -49,7 +48,7 @@ class InfSectionParser():
if UserExtFind and FindEnd == False:
if line:
SectionData.append(line)
- if line.lower().startswith(TAB_SECTION_START) and line.lower().endswith(TAB_SECTION_END):
+ if line.startswith(TAB_SECTION_START) and line.endswith(TAB_SECTION_END):
SectionLine = line
UserExtFind = True
FindEnd = False
@@ -59,7 +58,7 @@ class InfSectionParser():
UserExtFind = False
FindEnd = True
self._FileSectionDataList.append({SectionLine: SectionData[:]})
- SectionData = []
+ del SectionData[:]
SectionLine = ''
# Get user extension TianoCore data
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* [PATCH v1 32/42] BaseTools: trim whitespace
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (30 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 31/42] BaseTools: remove redundant content in InfSectionParser Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 33/42] BaseTools: AutoGen - add Opcode constants Jaben Carsey
` (10 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/AutoGen.py | 190 +--
BaseTools/Source/Python/AutoGen/BuildEngine.py | 2 +-
BaseTools/Source/Python/AutoGen/GenC.py | 74 +-
BaseTools/Source/Python/AutoGen/GenPcdDb.py | 208 +--
BaseTools/Source/Python/AutoGen/InfSectionParser.py | 12 +-
BaseTools/Source/Python/AutoGen/StrGather.py | 26 +-
BaseTools/Source/Python/AutoGen/UniClassObject.py | 18 +-
BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py | 42 +-
BaseTools/Source/Python/BPDG/BPDG.py | 56 +-
BaseTools/Source/Python/BPDG/GenVpd.py | 132 +-
BaseTools/Source/Python/BPDG/StringTable.py | 10 +-
BaseTools/Source/Python/Common/BuildVersion.py | 6 +-
BaseTools/Source/Python/Common/Database.py | 17 +-
BaseTools/Source/Python/Common/MigrationUtilities.py | 64 +-
BaseTools/Source/Python/Common/Misc.py | 50 +-
BaseTools/Source/Python/Common/MultipleWorkspace.py | 17 +-
BaseTools/Source/Python/Common/RangeExpression.py | 126 +-
BaseTools/Source/Python/Common/String.py | 2 +-
BaseTools/Source/Python/Common/ToolDefClassObject.py | 2 +-
BaseTools/Source/Python/Common/VariableAttributes.py | 12 +-
BaseTools/Source/Python/Common/VpdInfoFile.py | 82 +-
BaseTools/Source/Python/CommonDataClass/FdfClass.py | 28 +-
BaseTools/Source/Python/Ecc/CLexer.py | 8 +-
BaseTools/Source/Python/Ecc/CParser.py | 1468 ++++++++++----------
BaseTools/Source/Python/Ecc/Check.py | 22 +-
BaseTools/Source/Python/Ecc/CodeFragment.py | 3 +-
BaseTools/Source/Python/Ecc/CodeFragmentCollector.py | 124 +-
BaseTools/Source/Python/Ecc/Configuration.py | 10 +-
BaseTools/Source/Python/Ecc/Ecc.py | 26 +-
BaseTools/Source/Python/Ecc/Exception.py | 14 +-
BaseTools/Source/Python/Ecc/FileProfile.py | 5 +-
BaseTools/Source/Python/Ecc/MetaDataParser.py | 46 +-
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py | 100 +-
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py | 88 +-
BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py | 4 +-
BaseTools/Source/Python/Ecc/Xml/__init__.py | 6 +-
BaseTools/Source/Python/Ecc/c.py | 12 +-
BaseTools/Source/Python/Eot/CLexer.py | 8 +-
BaseTools/Source/Python/Eot/CParser.py | 1468 ++++++++++----------
BaseTools/Source/Python/Eot/Eot.py | 16 +-
BaseTools/Source/Python/Eot/Report.py | 4 +-
BaseTools/Source/Python/GenFds/Capsule.py | 2 +-
BaseTools/Source/Python/GenFds/CapsuleData.py | 18 +-
BaseTools/Source/Python/GenFds/EfiSection.py | 8 +-
BaseTools/Source/Python/GenFds/Fd.py | 2 +-
BaseTools/Source/Python/GenFds/FdfParser.py | 144 +-
BaseTools/Source/Python/GenFds/Ffs.py | 10 +-
BaseTools/Source/Python/GenFds/FfsFileStatement.py | 4 +-
BaseTools/Source/Python/GenFds/FfsInfStatement.py | 62 +-
BaseTools/Source/Python/GenFds/Fv.py | 26 +-
BaseTools/Source/Python/GenFds/GenFds.py | 28 +-
BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py | 36 +-
BaseTools/Source/Python/GenFds/GuidSection.py | 2 +-
BaseTools/Source/Python/GenFds/OptRomFileStatement.py | 6 +-
BaseTools/Source/Python/GenFds/OptRomInfStatement.py | 21 +-
BaseTools/Source/Python/GenFds/OptionRom.py | 49 +-
BaseTools/Source/Python/GenFds/Region.py | 4 +-
BaseTools/Source/Python/GenFds/Section.py | 2 +-
BaseTools/Source/Python/GenFds/Vtf.py | 18 +-
BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py | 26 +-
BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py | 6 +-
BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py | 34 +-
BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py | 30 +-
BaseTools/Source/Python/Table/Table.py | 20 +-
BaseTools/Source/Python/Table/TableDataModel.py | 14 +-
BaseTools/Source/Python/Table/TableDec.py | 12 +-
BaseTools/Source/Python/Table/TableDsc.py | 12 +-
BaseTools/Source/Python/Table/TableEotReport.py | 6 +-
BaseTools/Source/Python/Table/TableFdf.py | 12 +-
BaseTools/Source/Python/Table/TableFile.py | 12 +-
BaseTools/Source/Python/Table/TableFunction.py | 8 +-
BaseTools/Source/Python/Table/TableIdentifier.py | 4 +-
BaseTools/Source/Python/Table/TableInf.py | 12 +-
BaseTools/Source/Python/Table/TablePcd.py | 4 +-
BaseTools/Source/Python/Table/TableReport.py | 6 +-
BaseTools/Source/Python/TargetTool/TargetTool.py | 24 +-
BaseTools/Source/Python/Trim/Trim.py | 20 +-
BaseTools/Source/Python/Workspace/DscBuildData.py | 2 +-
BaseTools/Source/Python/Workspace/MetaFileParser.py | 2 +-
BaseTools/Source/Python/Workspace/MetaFileTable.py | 88 +-
BaseTools/Source/Python/Workspace/WorkspaceDatabase.py | 24 +-
BaseTools/Source/Python/build/BuildReport.py | 34 +-
BaseTools/Source/Python/build/build.py | 14 +-
BaseTools/Source/Python/sitecustomize.py | 2 +-
84 files changed, 2736 insertions(+), 2742 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index dc82075c5876..39d5932a9a66 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -236,7 +236,7 @@ class WorkspaceAutoGen(AutoGen):
super(WorkspaceAutoGen, self).__init__(Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs)
self._InitWorker(Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs)
self._Init = True
-
+
## Initialize WorkspaceAutoGen
#
# @param WorkspaceDir Root directory of workspace
@@ -305,7 +305,7 @@ class WorkspaceAutoGen(AutoGen):
ExtraData="Build target [%s] is not supported by the platform. [Valid target: %s]"
% (self.BuildTarget, " ".join(self.Platform.BuildTargets)))
-
+
# parse FDF file to get PCDs in it, if any
if not self.FdfFile:
self.FdfFile = self.Platform.FlashDefinition
@@ -743,7 +743,7 @@ class WorkspaceAutoGen(AutoGen):
## _CheckDuplicateInFV() method
#
- # Check whether there is duplicate modules/files exist in FV section.
+ # Check whether there is duplicate modules/files exist in FV section.
# The check base on the file GUID;
#
def _CheckDuplicateInFV(self, Fdf):
@@ -774,7 +774,7 @@ class WorkspaceAutoGen(AutoGen):
Module.Guid.upper()),
ExtraData=self.FdfFile)
#
- # Some INF files not have entity in DSC file.
+ # Some INF files not have entity in DSC file.
#
if not InfFoundFlag:
if FfsFile.InfFileName.find('$') == -1:
@@ -784,7 +784,7 @@ class WorkspaceAutoGen(AutoGen):
PathClassObj = PathClass(FfsFile.InfFileName, self.WorkspaceDir)
#
- # Here we just need to get FILE_GUID from INF file, use 'COMMON' as ARCH attribute. and use
+ # Here we just need to get FILE_GUID from INF file, use 'COMMON' as ARCH attribute. and use
# BuildObject from one of AutoGenObjectList is enough.
#
InfObj = self.AutoGenObjectList[0].BuildDatabase.WorkspaceDb.BuildObject[PathClassObj, TAB_ARCH_COMMON, self.BuildTarget, self.ToolChain]
@@ -803,7 +803,7 @@ class WorkspaceAutoGen(AutoGen):
if FfsFile.NameGuid is not None:
#
- # If the NameGuid reference a PCD name.
+ # If the NameGuid reference a PCD name.
# The style must match: PCD(xxxx.yyy)
#
if gPCDAsGuidPattern.match(FfsFile.NameGuid):
@@ -880,7 +880,7 @@ class WorkspaceAutoGen(AutoGen):
for Pcd in Pa.Platform.Pcds:
PcdType = Pa.Platform.Pcds[Pcd].Type
- # If no PCD type, this PCD comes from FDF
+ # If no PCD type, this PCD comes from FDF
if not PcdType:
continue
@@ -972,14 +972,14 @@ class WorkspaceAutoGen(AutoGen):
## Check the PCDs token value conflict in each DEC file.
#
# Will cause build break and raise error message while two PCDs conflict.
- #
+ #
# @return None
#
def _CheckAllPcdsTokenValueConflict(self):
for Pa in self.AutoGenObjectList:
for Package in Pa.PackageList:
PcdList = Package.Pcds.values()
- PcdList.sort(lambda x, y: cmp(int(x.TokenValue, 0), int(y.TokenValue, 0)))
+ PcdList.sort(lambda x, y: cmp(int(x.TokenValue, 0), int(y.TokenValue, 0)))
Count = 0
while (Count < len(PcdList) - 1) :
Item = PcdList[Count]
@@ -1103,20 +1103,20 @@ class PlatformAutoGen(AutoGen):
self._InitWorker(Workspace, MetaFile, Target, Toolchain, Arch)
self._Init = True
#
- # Used to store all PCDs for both PEI and DXE phase, in order to generate
+ # Used to store all PCDs for both PEI and DXE phase, in order to generate
# correct PCD database
- #
+ #
_DynaPcdList_ = []
_NonDynaPcdList_ = []
_PlatformPcds = {}
-
+
#
- # The priority list while override build option
+ # The priority list while override build option
#
PrioList = {"0x11111" : 16, # TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE (Highest)
"0x01111" : 15, # ******_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
"0x10111" : 14, # TARGET_*********_ARCH_COMMANDTYPE_ATTRIBUTE
- "0x00111" : 13, # ******_*********_ARCH_COMMANDTYPE_ATTRIBUTE
+ "0x00111" : 13, # ******_*********_ARCH_COMMANDTYPE_ATTRIBUTE
"0x11011" : 12, # TARGET_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
"0x01011" : 11, # ******_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
"0x10011" : 10, # TARGET_*********_****_COMMANDTYPE_ATTRIBUTE
@@ -1268,17 +1268,17 @@ class PlatformAutoGen(AutoGen):
#
def CollectFixedAtBuildPcds(self):
for LibAuto in self.LibraryAutoGenList:
- FixedAtBuildPcds = {}
- ShareFixedAtBuildPcdsSameValue = {}
- for Module in LibAuto._ReferenceModules:
+ FixedAtBuildPcds = {}
+ ShareFixedAtBuildPcdsSameValue = {}
+ for Module in LibAuto._ReferenceModules:
for Pcd in Module.FixedAtBuildPcds + LibAuto.FixedAtBuildPcds:
- key = ".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName))
+ key = ".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName))
if key not in FixedAtBuildPcds:
ShareFixedAtBuildPcdsSameValue[key] = True
FixedAtBuildPcds[key] = Pcd.DefaultValue
else:
if FixedAtBuildPcds[key] != Pcd.DefaultValue:
- ShareFixedAtBuildPcdsSameValue[key] = False
+ ShareFixedAtBuildPcdsSameValue[key] = False
for Pcd in LibAuto.FixedAtBuildPcds:
key = ".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName))
if (Pcd.TokenCName,Pcd.TokenSpaceGuidCName) not in self.NonDynamicPcdDict:
@@ -1287,7 +1287,7 @@ class PlatformAutoGen(AutoGen):
DscPcd = self.NonDynamicPcdDict[(Pcd.TokenCName,Pcd.TokenSpaceGuidCName)]
if DscPcd.Type != TAB_PCDS_FIXED_AT_BUILD:
continue
- if key in ShareFixedAtBuildPcdsSameValue and ShareFixedAtBuildPcdsSameValue[key]:
+ if key in ShareFixedAtBuildPcdsSameValue and ShareFixedAtBuildPcdsSameValue[key]:
LibAuto.ConstPcd[key] = FixedAtBuildPcds[key]
def CollectVariables(self, DynamicPcdSet):
@@ -1388,7 +1388,7 @@ class PlatformAutoGen(AutoGen):
for F in self.Platform.Modules.keys():
M = ModuleAutoGen(self.Workspace, F, self.BuildTarget, self.ToolChain, self.Arch, self.MetaFile)
#GuidValue.update(M.Guids)
-
+
self.Platform.Modules[F].M = M
for PcdFromModule in M.ModulePcdList + M.LibraryPcdList:
@@ -1400,27 +1400,27 @@ class PlatformAutoGen(AutoGen):
if M.IsBinaryModule == True:
PcdFromModule.IsFromBinaryInf = True
- # Check the PCD from DSC or not
+ # Check the PCD from DSC or not
PcdFromModule.IsFromDsc = (PcdFromModule.TokenCName, PcdFromModule.TokenSpaceGuidCName) in self.Platform.Pcds
if PcdFromModule.Type in GenC.gDynamicPcd or PcdFromModule.Type in GenC.gDynamicExPcd:
if F.Path not in FdfModuleList:
- # If one of the Source built modules listed in the DSC is not listed
- # in FDF modules, and the INF lists a PCD can only use the PcdsDynamic
- # access method (it is only listed in the DEC file that declares the
+ # If one of the Source built modules listed in the DSC is not listed
+ # in FDF modules, and the INF lists a PCD can only use the PcdsDynamic
+ # access method (it is only listed in the DEC file that declares the
# PCD as PcdsDynamic), then build tool will report warning message
- # notify the PI that they are attempting to build a module that must
- # be included in a flash image in order to be functional. These Dynamic
- # PCD will not be added into the Database unless it is used by other
+ # notify the PI that they are attempting to build a module that must
+ # be included in a flash image in order to be functional. These Dynamic
+ # PCD will not be added into the Database unless it is used by other
# modules that are included in the FDF file.
if PcdFromModule.Type in GenC.gDynamicPcd and \
PcdFromModule.IsFromBinaryInf == False:
# Print warning message to let the developer make a determine.
continue
- # If one of the Source built modules listed in the DSC is not listed in
- # FDF modules, and the INF lists a PCD can only use the PcdsDynamicEx
- # access method (it is only listed in the DEC file that declares the
- # PCD as PcdsDynamicEx), then DO NOT break the build; DO NOT add the
+ # If one of the Source built modules listed in the DSC is not listed in
+ # FDF modules, and the INF lists a PCD can only use the PcdsDynamicEx
+ # access method (it is only listed in the DEC file that declares the
+ # PCD as PcdsDynamicEx), then DO NOT break the build; DO NOT add the
# PCD to the Platform's PCD Database.
if PcdFromModule.Type in GenC.gDynamicExPcd:
continue
@@ -1448,14 +1448,14 @@ class PlatformAutoGen(AutoGen):
PcdFromModule.Pending = False
self._NonDynaPcdList_.append (PcdFromModule)
DscModuleSet = {os.path.normpath(ModuleInf.Path) for ModuleInf in self.Platform.Modules}
- # add the PCD from modules that listed in FDF but not in DSC to Database
+ # add the PCD from modules that listed in FDF but not in DSC to Database
for InfName in FdfModuleList:
if InfName not in DscModuleSet:
InfClass = PathClass(InfName)
M = self.BuildDatabase[InfClass, self.Arch, self.BuildTarget, self.ToolChain]
- # If a module INF in FDF but not in current arch's DSC module list, it must be module (either binary or source)
- # for different Arch. PCDs in source module for different Arch is already added before, so skip the source module here.
- # For binary module, if in current arch, we need to list the PCDs into database.
+ # If a module INF in FDF but not in current arch's DSC module list, it must be module (either binary or source)
+ # for different Arch. PCDs in source module for different Arch is already added before, so skip the source module here.
+ # For binary module, if in current arch, we need to list the PCDs into database.
if not M.IsSupportedArch:
continue
# Override the module PCD setting by platform setting
@@ -1480,20 +1480,20 @@ class PlatformAutoGen(AutoGen):
self._NonDynaPcdList_.append(PcdFromModule)
if PcdFromModule in self._DynaPcdList_ and PcdFromModule.Phase == 'PEI' and PcdFromModule.Type in GenC.gDynamicExPcd:
# Overwrite the phase of any the same PCD existing, if Phase is PEI.
- # It is to solve the case that a dynamic PCD used by a PEM module/PEI
+ # It is to solve the case that a dynamic PCD used by a PEM module/PEI
# module & DXE module at a same time.
# Overwrite the type of the PCDs in source INF by the type of AsBuild
- # INF file as DynamicEx.
+ # INF file as DynamicEx.
Index = self._DynaPcdList_.index(PcdFromModule)
self._DynaPcdList_[Index].Phase = PcdFromModule.Phase
self._DynaPcdList_[Index].Type = PcdFromModule.Type
for PcdFromModule in self._NonDynaPcdList_:
- # If a PCD is not listed in the DSC file, but binary INF files used by
- # this platform all (that use this PCD) list the PCD in a [PatchPcds]
- # section, AND all source INF files used by this platform the build
- # that use the PCD list the PCD in either a [Pcds] or [PatchPcds]
+ # If a PCD is not listed in the DSC file, but binary INF files used by
+ # this platform all (that use this PCD) list the PCD in a [PatchPcds]
+ # section, AND all source INF files used by this platform the build
+ # that use the PCD list the PCD in either a [Pcds] or [PatchPcds]
# section, then the tools must NOT add the PCD to the Platform's PCD
- # Database; the build must assign the access method for this PCD as
+ # Database; the build must assign the access method for this PCD as
# PcdsPatchableInModule.
if PcdFromModule not in self._DynaPcdList_:
continue
@@ -1516,7 +1516,7 @@ class PlatformAutoGen(AutoGen):
self._DynamicPcdList = self._DynaPcdList_
#
# Sort dynamic PCD list to:
- # 1) If PCD's datum type is VOID* and value is unicode string which starts with L, the PCD item should
+ # 1) If PCD's datum type is VOID* and value is unicode string which starts with L, the PCD item should
# try to be put header of dynamicd List
# 2) If PCD is HII type, the PCD item should be put after unicode type PCD
#
@@ -1537,7 +1537,7 @@ class PlatformAutoGen(AutoGen):
if self._PlatformPcds[item].DatumType and self._PlatformPcds[item].DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
self._PlatformPcds[item].DatumType = TAB_VOID
- if (self.Workspace.ArchList[-1] == self.Arch):
+ if (self.Workspace.ArchList[-1] == self.Arch):
for Pcd in self._DynamicPcdList:
# just pick the a value to determine whether is unicode string type
Sku = Pcd.SkuInfoList.values()[0]
@@ -1621,7 +1621,7 @@ class PlatformAutoGen(AutoGen):
#
# Fix the PCDs define in VPD PCD section that never referenced by module.
# An example is PCD for signature usage.
- #
+ #
for DscPcd in PlatformPcds:
DscPcdEntry = self._PlatformPcds[DscPcd]
if DscPcdEntry.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD]:
@@ -1643,8 +1643,8 @@ class PlatformAutoGen(AutoGen):
defaultindex = SkuObjList.index((TAB_DEFAULT,DefaultSku))
SkuObjList[0],SkuObjList[defaultindex] = SkuObjList[defaultindex],SkuObjList[0]
for (SkuName,Sku) in SkuObjList:
- Sku.VpdOffset = Sku.VpdOffset.strip()
-
+ Sku.VpdOffset = Sku.VpdOffset.strip()
+
# Need to iterate DEC pcd information to get the value & datumtype
for eachDec in self.PackageList:
for DecPcd in eachDec.Pcds:
@@ -1655,8 +1655,8 @@ class PlatformAutoGen(AutoGen):
EdkLogger.warn("build", "Unreferenced vpd pcd used!",
File=self.MetaFile, \
ExtraData = "PCD: %s.%s used in the DSC file %s is unreferenced." \
- %(DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, self.Platform.MetaFile.Path))
-
+ %(DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, self.Platform.MetaFile.Path))
+
DscPcdEntry.DatumType = DecPcdEntry.DatumType
DscPcdEntry.DefaultValue = DecPcdEntry.DefaultValue
DscPcdEntry.TokenValue = DecPcdEntry.TokenValue
@@ -1664,7 +1664,7 @@ class PlatformAutoGen(AutoGen):
# Only fix the value while no value provided in DSC file.
if not Sku.DefaultValue:
DscPcdEntry.SkuInfoList[DscPcdEntry.SkuInfoList.keys()[0]].DefaultValue = DecPcdEntry.DefaultValue
-
+
if DscPcdEntry not in self._DynamicPcdList:
self._DynamicPcdList.append(DscPcdEntry)
Sku.VpdOffset = Sku.VpdOffset.strip()
@@ -1695,7 +1695,7 @@ class PlatformAutoGen(AutoGen):
VpdFile.Add(DscPcdEntry, SkuName,Sku.VpdOffset)
SkuValueMap[PcdValue].append(Sku)
if not NeedProcessVpdMapFile and Sku.VpdOffset == "*":
- NeedProcessVpdMapFile = True
+ NeedProcessVpdMapFile = True
if DscPcdEntry.DatumType == TAB_VOID and PcdValue.startswith("L"):
UnicodePcdArray.add(DscPcdEntry)
elif len(Sku.VariableName) > 0:
@@ -1707,7 +1707,7 @@ class PlatformAutoGen(AutoGen):
VpdSkuMap[DscPcd] = SkuValueMap
if (self.Platform.FlashDefinition is None or self.Platform.FlashDefinition == '') and \
VpdFile.GetCount() != 0:
- EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE,
+ EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE,
"Fail to get FLASH_DEFINITION definition in DSC file %s which is required when DSC contains VPD PCD." % str(self.Platform.MetaFile))
if VpdFile.GetCount() != 0:
@@ -2055,9 +2055,9 @@ class PlatformAutoGen(AutoGen):
self._PcdTokenNumber = OrderedDict()
TokenNumber = 1
#
- # Make the Dynamic and DynamicEx PCD use within different TokenNumber area.
+ # Make the Dynamic and DynamicEx PCD use within different TokenNumber area.
# Such as:
- #
+ #
# Dynamic PCD:
# TokenNumber 0 ~ 10
# DynamicEx PCD:
@@ -2522,7 +2522,7 @@ class PlatformAutoGen(AutoGen):
# @param Options Options to be expanded
#
# @retval options Options expanded
- #
+ #
def _ExpandBuildOption(self, Options, ModuleStyle=None):
BuildOptions = {}
FamilyMatch = False
@@ -2547,9 +2547,9 @@ class PlatformAutoGen(AutoGen):
if OverrideList.get(Key[1]) is not None:
OverrideList.pop(Key[1])
OverrideList[Key[1]] = Options[Key]
-
+
#
- # Use the highest priority value.
+ # Use the highest priority value.
#
if (len(OverrideList) >= 2):
KeyList = OverrideList.keys()
@@ -2560,7 +2560,7 @@ class PlatformAutoGen(AutoGen):
NextKey = KeyList[Index1 + Index + 1]
#
# Compare two Key, if one is included by another, choose the higher priority one
- #
+ #
Target2, ToolChain2, Arch2, CommandType2, Attr2 = NextKey.split("_")
if Target1 == Target2 or Target1 == "*" or Target2 == "*":
if ToolChain1 == ToolChain2 or ToolChain1 == "*" or ToolChain2 == "*":
@@ -2573,7 +2573,7 @@ class PlatformAutoGen(AutoGen):
else:
if Options.get((self.BuildRuleFamily, NowKey)) is not None:
Options.pop((self.BuildRuleFamily, NowKey))
-
+
for Key in Options:
if ModuleStyle is not None and len (Key) > 2:
# Check Module style is EDK or EDKII.
@@ -2762,7 +2762,7 @@ class ModuleAutoGen(AutoGen):
% (MetaFile, Arch))
return None
return obj
-
+
## Initialize ModuleAutoGen
#
# @param Workspace EdkIIWorkspaceBuild object
@@ -2861,13 +2861,13 @@ class ModuleAutoGen(AutoGen):
self.AutoGenDepSet = set()
-
+
## The Modules referenced to this Library
# Only Library has this attribute
- self._ReferenceModules = []
-
+ self._ReferenceModules = []
+
## Store the FixedAtBuild Pcds
- #
+ #
self._FixedAtBuildPcds = []
self.ConstPcd = {}
return True
@@ -2884,8 +2884,8 @@ class ModuleAutoGen(AutoGen):
continue
if Pcd not in self._FixedAtBuildPcds:
self._FixedAtBuildPcds.append(Pcd)
-
- return self._FixedAtBuildPcds
+
+ return self._FixedAtBuildPcds
def _GetUniqueBaseName(self):
BaseName = self.Name
@@ -3087,7 +3087,7 @@ class ModuleAutoGen(AutoGen):
continue
PackageList.append(Package)
return PackageList
-
+
## Get the depex string
#
# @return : a string contain all depex expresion.
@@ -3116,7 +3116,7 @@ class ModuleAutoGen(AutoGen):
(Arch.upper() == self.Arch.upper() and \
ModuleType.upper() in [TAB_ARCH_COMMON, self.ModuleType.upper()]):
DepexList.append({(Arch, ModuleType): DepexExpr})
-
+
#the type of build module is USER_DEFINED.
if self.ModuleType.upper() == SUP_MODULE_USER_DEFINED:
for Depex in DepexList:
@@ -3127,7 +3127,7 @@ class ModuleAutoGen(AutoGen):
if not DepexStr:
return '[Depex.%s]\n' % self.Arch
return DepexStr
-
+
#the type of build module not is USER_DEFINED.
Count = 0
for Depex in DepexList:
@@ -3147,7 +3147,7 @@ class ModuleAutoGen(AutoGen):
if not DepexStr:
return '[Depex.%s]\n' % self.Arch
return '[Depex.%s]\n# ' % self.Arch + DepexStr
-
+
## Merge dependency expression
#
# @retval list The token list of the dependency expression after parsed
@@ -3283,7 +3283,7 @@ class ModuleAutoGen(AutoGen):
#
self._BuildOptionIncPathList = []
return self._BuildOptionIncPathList
-
+
BuildOptionIncPathList = []
for Tool in ('CC', 'PP', 'VFRPP', 'ASLPP', 'ASLCC', 'APP', 'ASM'):
Attr = 'FLAGS'
@@ -3291,7 +3291,7 @@ class ModuleAutoGen(AutoGen):
FlagOption = self.BuildOption[Tool][Attr]
except KeyError:
FlagOption = ''
-
+
if self.PlatformInfo.ToolChainFamily != 'RVCT':
IncPathList = [NormPath(Path, self.Macros) for Path in BuildOptIncludeRegEx.findall(FlagOption)]
else:
@@ -3304,7 +3304,7 @@ class ModuleAutoGen(AutoGen):
IncPathList += [NormPath(PathEntry, self.Macros) for PathEntry in PathList]
#
- # EDK II modules must not reference header files outside of the packages they depend on or
+ # EDK II modules must not reference header files outside of the packages they depend on or
# within the module's directory tree. Report error if violation.
#
if self.AutoGenVersion >= 0x00010005 and len(IncPathList) > 0:
@@ -3316,13 +3316,13 @@ class ModuleAutoGen(AutoGen):
ExtraData=ErrMsg,
File=str(self.MetaFile))
-
+
BuildOptionIncPathList += IncPathList
-
+
self._BuildOptionIncPathList = BuildOptionIncPathList
-
+
return self._BuildOptionIncPathList
-
+
## Return a list of files which can be built from source
#
# What kind of files can be built is determined by build rules in
@@ -3374,7 +3374,7 @@ class ModuleAutoGen(AutoGen):
Order_Dict[F].sort(key=lambda i: self.BuildRuleOrder.index(i))
for Ext in Order_Dict[F][1:]:
RemoveList.append(F + Ext)
-
+
for item in RemoveList:
FileList.remove(item)
@@ -3826,12 +3826,12 @@ class ModuleAutoGen(AutoGen):
for SourceFile in self.Module.Sources:
if SourceFile.Type.upper() == ".VFR" :
#
- # search the .map file to find the offset of vfr binary in the PE32+/TE file.
+ # search the .map file to find the offset of vfr binary in the PE32+/TE file.
#
VfrUniBaseName[SourceFile.BaseName] = (SourceFile.BaseName + "Bin")
if SourceFile.Type.upper() == ".UNI" :
#
- # search the .map file to find the offset of Uni strings binary in the PE32+/TE file.
+ # search the .map file to find the offset of Uni strings binary in the PE32+/TE file.
#
VfrUniBaseName["UniOffsetName"] = (self.Name + "Strings")
@@ -3852,7 +3852,7 @@ class ModuleAutoGen(AutoGen):
EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed for %s" % UniVfrOffsetFileName,None)
# Use a instance of StringIO to cache data
- fStringIO = StringIO('')
+ fStringIO = StringIO('')
for Item in VfrUniOffsetList:
if (Item[0].find("Strings") != -1):
@@ -3863,7 +3863,7 @@ class ModuleAutoGen(AutoGen):
#
UniGuid = [0xe0, 0xc5, 0x13, 0x89, 0xf6, 0x33, 0x86, 0x4d, 0x9b, 0xf1, 0x43, 0xef, 0x89, 0xfc, 0x6, 0x66]
UniGuid = [chr(ItemGuid) for ItemGuid in UniGuid]
- fStringIO.write(''.join(UniGuid))
+ fStringIO.write(''.join(UniGuid))
UniValue = pack ('Q', int (Item[1], 16))
fStringIO.write (UniValue)
else:
@@ -3874,13 +3874,13 @@ class ModuleAutoGen(AutoGen):
#
VfrGuid = [0xb4, 0x7c, 0xbc, 0xd0, 0x47, 0x6a, 0x5f, 0x49, 0xaa, 0x11, 0x71, 0x7, 0x46, 0xda, 0x6, 0xa2]
VfrGuid = [chr(ItemGuid) for ItemGuid in VfrGuid]
- fStringIO.write(''.join(VfrGuid))
+ fStringIO.write(''.join(VfrGuid))
VfrValue = pack ('Q', int (Item[1], 16))
fStringIO.write (VfrValue)
#
# write data into file.
#
- try :
+ try :
fInputfile.write (fStringIO.getvalue())
except:
EdkLogger.error("build", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the "
@@ -3901,15 +3901,15 @@ class ModuleAutoGen(AutoGen):
if self.IsAsBuiltInfCreated:
return
-
+
# Skip the following code for EDK I inf
if self.AutoGenVersion < 0x00010005:
return
-
+
# Skip the following code for libraries
if self.IsLibrary:
return
-
+
# Skip the following code for modules with no source files
if not self.SourceFileList:
return
@@ -3917,7 +3917,7 @@ class ModuleAutoGen(AutoGen):
# Skip the following code for modules without any binary files
if self.BinaryFileList:
return
-
+
### TODO: How to handles mixed source and binary modules
# Find all DynamicEx and PatchableInModule PCDs used by this module and dependent libraries
@@ -4211,7 +4211,7 @@ class ModuleAutoGen(AutoGen):
UsageIndex = Index
break
if UsageIndex != -1:
- PcdCommentList[UsageIndex] = '## %s %s %s' % (UsageStr, HiiInfo, PcdCommentList[UsageIndex].replace(UsageStr, ''))
+ PcdCommentList[UsageIndex] = '## %s %s %s' % (UsageStr, HiiInfo, PcdCommentList[UsageIndex].replace(UsageStr, ''))
else:
PcdCommentList.append('## UNDEFINED ' + HiiInfo)
PcdComments = '\n '.join(PcdCommentList)
@@ -4226,7 +4226,7 @@ class ModuleAutoGen(AutoGen):
# Generated LibraryClasses section in comments.
for Library in self.LibraryAutoGenList:
AsBuiltInfDict['libraryclasses_item'] += [Library.MetaFile.File.replace('\\', '/')]
-
+
# Generated UserExtensions TianoCore section.
# All tianocore user extensions are copied.
UserExtStr = ''
@@ -4242,12 +4242,12 @@ class ModuleAutoGen(AutoGen):
DepexExpresion = self._GetDepexExpresionString()
if DepexExpresion:
AsBuiltInfDict['depexsection_item'] = DepexExpresion
-
+
AsBuiltInf = TemplateString()
AsBuiltInf.Append(gAsBuiltInfHeaderString.Replace(AsBuiltInfDict))
-
+
SaveFileOnChange(os.path.join(self.OutputDir, self.Name + '.inf'), str(AsBuiltInf), False)
-
+
self.IsAsBuiltInfCreated = True
if GlobalData.gBinCacheDest:
self.CopyModuleToCache()
@@ -4570,7 +4570,7 @@ class ModuleAutoGen(AutoGen):
BuildOption = property(_GetModuleBuildOption)
BuildOptionIncPathList = property(_GetBuildOptionIncPathList)
BuildCommand = property(_GetBuildCommand)
-
+
FixedAtBuildPcds = property(_GetFixedAtBuildPcds)
# This acts like the main() function for the script, unless it is 'import'ed into another script.
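The AutoGen.py hunks above are trailing-whitespace removals. A minimal sketch of how such a cleanup can be applied or re-checked (illustrative only, not part of this series; the files passed on the command line are whatever .py sources you point it at):

import sys

def trim_trailing_whitespace(path):
    # Rewrite the file with trailing spaces/tabs removed from every line.
    # Note: rstrip() in this sketch also normalizes CRLF line endings to LF.
    with open(path) as source:
        lines = source.readlines()
    with open(path, 'w') as target:
        for line in lines:
            target.write(line.rstrip() + '\n')

if __name__ == '__main__':
    for name in sys.argv[1:]:
        trim_trailing_whitespace(name)
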
diff --git a/BaseTools/Source/Python/AutoGen/BuildEngine.py b/BaseTools/Source/Python/AutoGen/BuildEngine.py
index 2c823797d7c5..dc803b094300 100644
--- a/BaseTools/Source/Python/AutoGen/BuildEngine.py
+++ b/BaseTools/Source/Python/AutoGen/BuildEngine.py
@@ -359,7 +359,7 @@ class BuildRule:
# Clean up the line and replace path separator with native one
Line = self.RuleContent[Index].strip().replace(self._PATH_SEP, os.path.sep)
self.RuleContent[Index] = Line
-
+
# find the build_rule_version
if Line and Line[0] == "#" and Line.find(TAB_BUILD_RULE_VERSION) <> -1:
if Line.find("=") <> -1 and Line.find("=") < (len(Line) - 1) and (Line[(Line.find("=") + 1):]).split():
diff --git a/BaseTools/Source/Python/AutoGen/GenC.py b/BaseTools/Source/Python/AutoGen/GenC.py
index 4e7e3d90be64..11c88803f201 100644
--- a/BaseTools/Source/Python/AutoGen/GenC.py
+++ b/BaseTools/Source/Python/AutoGen/GenC.py
@@ -789,7 +789,7 @@ gModuleTypeHeaderFile = {
"USER_DEFINED" : [gBasicHeaderFile]
}
-## Autogen internal worker macro to define DynamicEx PCD name includes both the TokenSpaceGuidName
+## Autogen internal worker macro to define DynamicEx PCD name includes both the TokenSpaceGuidName
# the TokenName and Guid comparison to avoid define name collisions.
#
# @param Info The ModuleAutoGen object
@@ -809,7 +809,7 @@ def DynExPcdTokenNumberMapping(Info, AutoGenH):
return
AutoGenH.Append('\n#define COMPAREGUID(Guid1, Guid2) (BOOLEAN)(*(CONST UINT64*)Guid1 == *(CONST UINT64*)Guid2 && *((CONST UINT64*)Guid1 + 1) == *((CONST UINT64*)Guid2 + 1))\n')
# AutoGen for each PCD listed in a [PcdEx] section of a Module/Lib INF file.
- # Auto generate a macro for each TokenName that takes a Guid pointer as a parameter.
+ # Auto generate a macro for each TokenName that takes a Guid pointer as a parameter.
# Use the Guid pointer to see if it matches any of the token space GUIDs.
TokenCNameList = set()
for TokenCName in ExTokenCNameList:
@@ -827,15 +827,15 @@ def DynExPcdTokenNumberMapping(Info, AutoGenH):
Index = Index + 1
if Index == 1:
AutoGenH.Append('\n#define __PCD_%s_ADDR_CMP(GuidPtr) (' % (RealTokenCName))
- AutoGenH.Append('\\\n (GuidPtr == &%s) ? _PCD_TOKEN_%s_%s:'
+ AutoGenH.Append('\\\n (GuidPtr == &%s) ? _PCD_TOKEN_%s_%s:'
% (Pcd.TokenSpaceGuidCName, Pcd.TokenSpaceGuidCName, RealTokenCName))
else:
- AutoGenH.Append('\\\n (GuidPtr == &%s) ? _PCD_TOKEN_%s_%s:'
+ AutoGenH.Append('\\\n (GuidPtr == &%s) ? _PCD_TOKEN_%s_%s:'
% (Pcd.TokenSpaceGuidCName, Pcd.TokenSpaceGuidCName, RealTokenCName))
if Index == Count:
AutoGenH.Append('0 \\\n )\n')
TokenCNameList.add(TokenCName)
-
+
TokenCNameList = set()
for TokenCName in ExTokenCNameList:
if TokenCName in TokenCNameList:
@@ -853,14 +853,14 @@ def DynExPcdTokenNumberMapping(Info, AutoGenH):
if Index == 1:
AutoGenH.Append('\n#define __PCD_%s_VAL_CMP(GuidPtr) (' % (RealTokenCName))
AutoGenH.Append('\\\n (GuidPtr == NULL) ? 0:')
- AutoGenH.Append('\\\n COMPAREGUID (GuidPtr, &%s) ? _PCD_TOKEN_%s_%s:'
+ AutoGenH.Append('\\\n COMPAREGUID (GuidPtr, &%s) ? _PCD_TOKEN_%s_%s:'
% (Pcd.TokenSpaceGuidCName, Pcd.TokenSpaceGuidCName, RealTokenCName))
else:
- AutoGenH.Append('\\\n COMPAREGUID (GuidPtr, &%s) ? _PCD_TOKEN_%s_%s:'
+ AutoGenH.Append('\\\n COMPAREGUID (GuidPtr, &%s) ? _PCD_TOKEN_%s_%s:'
% (Pcd.TokenSpaceGuidCName, Pcd.TokenSpaceGuidCName, RealTokenCName))
if Index == Count:
AutoGenH.Append('0 \\\n )\n')
- # Autogen internal worker macro to compare GUIDs. Guid1 is a pointer to a GUID.
+ # Autogen internal worker macro to compare GUIDs. Guid1 is a pointer to a GUID.
# Guid2 is a C name for a GUID. Compare pointers first because optimizing compiler
# can do this at build time on CONST GUID pointers and optimize away call to COMPAREGUID().
# COMPAREGUID() will only be used if the Guid passed in is local to the module.
@@ -895,22 +895,22 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
if Pcd.PcdValueFromComm:
Pcd.DefaultValue = Pcd.PcdValueFromComm
-
+
if Pcd.Type in gDynamicExPcd:
TokenNumber = int(Pcd.TokenValue, 0)
- # Add TokenSpaceGuidValue value to PcdTokenName to discriminate the DynamicEx PCDs with
+ # Add TokenSpaceGuidValue value to PcdTokenName to discriminate the DynamicEx PCDs with
# different Guids but same TokenCName
PcdExTokenName = '_PCD_TOKEN_' + Pcd.TokenSpaceGuidCName + '_' + TokenCName
AutoGenH.Append('\n#define %s %dU\n' % (PcdExTokenName, TokenNumber))
else:
if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) not in PcdTokenNumber:
- # If one of the Source built modules listed in the DSC is not listed in FDF modules,
- # and the INF lists a PCD can only use the PcdsDynamic access method (it is only
- # listed in the DEC file that declares the PCD as PcdsDynamic), then build tool will
- # report warning message notify the PI that they are attempting to build a module
- # that must be included in a flash image in order to be functional. These Dynamic PCD
- # will not be added into the Database unless it is used by other modules that are
- # included in the FDF file.
+ # If one of the Source built modules listed in the DSC is not listed in FDF modules,
+ # and the INF lists a PCD can only use the PcdsDynamic access method (it is only
+ # listed in the DEC file that declares the PCD as PcdsDynamic), then build tool will
+ # report warning message notify the PI that they are attempting to build a module
+ # that must be included in a flash image in order to be functional. These Dynamic PCD
+ # will not be added into the Database unless it is used by other modules that are
+ # included in the FDF file.
# In this case, just assign an invalid token number to make it pass build.
if Pcd.Type in PCD_DYNAMIC_TYPE_LIST:
TokenNumber = 0
@@ -934,7 +934,7 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
SetModeName = '_PCD_SET_MODE_' + gDatumSizeStringDatabaseH[Pcd.DatumType] + '_' + TokenCName if Pcd.DatumType in gDatumSizeStringDatabaseH else '_PCD_SET_MODE_' + gDatumSizeStringDatabaseH[TAB_VOID] + '_' + TokenCName
SetModeStatusName = '_PCD_SET_MODE_' + gDatumSizeStringDatabaseH[Pcd.DatumType] + '_S_' + TokenCName if Pcd.DatumType in gDatumSizeStringDatabaseH else '_PCD_SET_MODE_' + gDatumSizeStringDatabaseH[TAB_VOID] + '_S_' + TokenCName
GetModeSizeName = '_PCD_GET_MODE_SIZE' + '_' + TokenCName
-
+
if Pcd.Type in gDynamicExPcd:
if Info.IsLibrary:
PcdList = Info.LibraryPcdList
@@ -1049,7 +1049,7 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
"Too large PCD value for datum type [%s] of PCD %s.%s" % (Pcd.DatumType, Pcd.TokenSpaceGuidCName, TokenCName),
ExtraData="[%s]" % str(Info))
if not Value.endswith('U'):
- Value += 'U'
+ Value += 'U'
elif Pcd.DatumType == TAB_UINT8:
if ValueNumber < 0:
EdkLogger.error("build", AUTOGEN_ERROR,
@@ -1116,7 +1116,7 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
PcdValueName = '_PCD_PATCHABLE_VALUE_' + TokenCName
else:
PcdValueName = '_PCD_VALUE_' + TokenCName
-
+
if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
#
# For unicode, UINT16 array will be generated, so the alignment of unicode is guaranteed.
@@ -1129,7 +1129,7 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED %s UINT8 %s%s = %s;\n' % (Const, PcdVariableName, Array, Value))
AutoGenH.Append('extern %s UINT8 %s%s;\n' %(Const, PcdVariableName, Array))
AutoGenH.Append('#define %s %s%s\n' %(GetModeName, Type, PcdVariableName))
-
+
PcdDataSize = Pcd.GetPcdSize()
if Pcd.Type == TAB_PCDS_FIXED_AT_BUILD:
AutoGenH.Append('#define %s %s\n' % (FixPcdSizeTokenName, PcdDataSize))
@@ -1146,10 +1146,10 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
AutoGenC.Append('volatile %s %s %s = %s;\n' %(Const, Pcd.DatumType, PcdVariableName, PcdValueName))
AutoGenH.Append('extern volatile %s %s %s%s;\n' % (Const, Pcd.DatumType, PcdVariableName, Array))
AutoGenH.Append('#define %s %s%s\n' % (GetModeName, Type, PcdVariableName))
-
+
PcdDataSize = Pcd.GetPcdSize()
AutoGenH.Append('#define %s %s\n' % (PatchPcdSizeTokenName, PcdDataSize))
-
+
AutoGenH.Append('#define %s %s \n' % (GetModeSizeName,PatchPcdSizeVariableName))
AutoGenH.Append('extern UINTN %s; \n' % PatchPcdSizeVariableName)
AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED UINTN %s = %s;\n' % (PatchPcdSizeVariableName,PcdDataSize))
@@ -1157,7 +1157,7 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
PcdDataSize = Pcd.GetPcdSize()
AutoGenH.Append('#define %s %s\n' % (FixPcdSizeTokenName, PcdDataSize))
AutoGenH.Append('#define %s %s \n' % (GetModeSizeName,FixPcdSizeTokenName))
-
+
AutoGenH.Append('#define %s %s\n' %(PcdValueName, Value))
AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED %s %s %s = %s;\n' %(Const, Pcd.DatumType, PcdVariableName, PcdValueName))
AutoGenH.Append('extern %s %s %s%s;\n' % (Const, Pcd.DatumType, PcdVariableName, Array))
@@ -1204,13 +1204,13 @@ def CreateLibraryPcdCode(Info, AutoGenC, AutoGenH, Pcd):
TokenNumber = int(Pcd.TokenValue, 0)
else:
if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) not in PcdTokenNumber:
- # If one of the Source built modules listed in the DSC is not listed in FDF modules,
- # and the INF lists a PCD can only use the PcdsDynamic access method (it is only
- # listed in the DEC file that declares the PCD as PcdsDynamic), then build tool will
- # report warning message notify the PI that they are attempting to build a module
- # that must be included in a flash image in order to be functional. These Dynamic PCD
- # will not be added into the Database unless it is used by other modules that are
- # included in the FDF file.
+ # If one of the Source built modules listed in the DSC is not listed in FDF modules,
+ # and the INF lists a PCD can only use the PcdsDynamic access method (it is only
+ # listed in the DEC file that declares the PCD as PcdsDynamic), then build tool will
+ # report warning message notify the PI that they are attempting to build a module
+ # that must be included in a flash image in order to be functional. These Dynamic PCD
+ # will not be added into the Database unless it is used by other modules that are
+ # included in the FDF file.
# In this case, just assign an invalid token number to make it pass build.
if Pcd.Type in PCD_DYNAMIC_TYPE_LIST:
TokenNumber = 0
@@ -1244,7 +1244,7 @@ def CreateLibraryPcdCode(Info, AutoGenC, AutoGenH, Pcd):
if PcdItemType in gDynamicExPcd:
PcdExTokenName = '_PCD_TOKEN_' + TokenSpaceGuidCName + '_' + TokenCName
AutoGenH.Append('\n#define %s %dU\n' % (PcdExTokenName, TokenNumber))
-
+
if Info.IsLibrary:
PcdList = Info.LibraryPcdList
else:
@@ -1326,7 +1326,7 @@ def CreateLibraryPcdCode(Info, AutoGenC, AutoGenH, Pcd):
AutoGenH.Append('#define %s %s\n' % (GetModeSizeName,PatchPcdSizeVariableName))
AutoGenH.Append('extern UINTN %s; \n' % PatchPcdSizeVariableName)
-
+
if PcdItemType == TAB_PCDS_FIXED_AT_BUILD or PcdItemType == TAB_PCDS_FEATURE_FLAG:
key = ".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName))
PcdVariableName = '_gPcd_' + gItemTypeStringDatabase[Pcd.Type] + '_' + TokenCName
@@ -1337,7 +1337,7 @@ def CreateLibraryPcdCode(Info, AutoGenC, AutoGenH, Pcd):
AutoGenH.Append('extern const %s _gPcd_FixedAtBuild_%s%s;\n' %(DatumType, TokenCName, Array))
AutoGenH.Append('#define %s %s_gPcd_FixedAtBuild_%s\n' %(GetModeName, Type, TokenCName))
AutoGenH.Append('//#define %s ASSERT(FALSE) // It is not allowed to set value for a FIXED_AT_BUILD PCD\n' % SetModeName)
-
+
ConstFixedPcd = False
if PcdItemType == TAB_PCDS_FIXED_AT_BUILD and (key in Info.ConstPcd or (Info.IsLibrary and not Info._ReferenceModules)):
ConstFixedPcd = True
@@ -1670,7 +1670,7 @@ def CreatePcdCode(Info, AutoGenC, AutoGenH):
for Pcd in Info.ModulePcdList:
if Pcd.Type in gDynamicExPcd and Pcd.TokenSpaceGuidCName not in TokenSpaceList:
TokenSpaceList += [Pcd.TokenSpaceGuidCName]
-
+
SkuMgr = Info.Workspace.Platform.SkuIdMgr
AutoGenH.Append("\n// Definition of SkuId Array\n")
AutoGenH.Append("extern UINT64 _gPcd_SkuId_Array[];\n")
@@ -1680,7 +1680,7 @@ def CreatePcdCode(Info, AutoGenC, AutoGenH):
if Info.ModuleType in ["USER_DEFINED", "BASE"]:
GuidType = "GUID"
else:
- GuidType = "EFI_GUID"
+ GuidType = "EFI_GUID"
for Item in TokenSpaceList:
AutoGenH.Append('extern %s %s;\n' % (GuidType, Item))
@@ -2032,7 +2032,7 @@ def CreateHeaderCode(Info, AutoGenC, AutoGenH):
and gModuleTypeHeaderFile[Info.ModuleType][0] != gBasicHeaderFile:
AutoGenH.Append("#include <%s>\n" % gModuleTypeHeaderFile[Info.ModuleType][0])
#
- # if either PcdLib in [LibraryClasses] sections or there exist Pcd section, add PcdLib.h
+ # if either PcdLib in [LibraryClasses] sections or there exist Pcd section, add PcdLib.h
# As if modules only uses FixedPcd, then PcdLib is not needed in [LibraryClasses] section.
#
if 'PcdLib' in Info.Module.LibraryClasses or Info.Module.Pcds:
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index 9280eeee641c..ef6647a15302 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -183,10 +183,10 @@ typedef struct {
//UINT32 UninitDataBaseSize;// Total size for PCD those default value with 0.
//TABLE_OFFSET LocalTokenNumberTableOffset;
//TABLE_OFFSET ExMapTableOffset;
- //TABLE_OFFSET GuidTableOffset;
+ //TABLE_OFFSET GuidTableOffset;
//TABLE_OFFSET StringTableOffset;
//TABLE_OFFSET SizeTableOffset;
- //TABLE_OFFSET SkuIdTableOffset;
+ //TABLE_OFFSET SkuIdTableOffset;
//TABLE_OFFSET PcdNameTableOffset;
//UINT16 LocalTokenCount; // LOCAL_TOKEN_NUMBER for all
//UINT16 ExTokenCount; // EX_TOKEN_NUMBER for DynamicEx
@@ -238,11 +238,11 @@ ${PHASE}_PCD_DATABASE_INIT g${PHASE}PcdDbInit = {
## DbItemList
#
-# The class holds the Pcd database items. ItemSize if not zero should match the item datum type in the C structure.
+# The class holds the Pcd database items. ItemSize if not zero should match the item datum type in the C structure.
# When the structure is changed, remember to check the ItemSize and the related PackStr in PackData()
-# RawDataList is the RawData that may need some kind of calculation or transformation,
+# RawDataList is the RawData that may need some kind of calculation or transformation,
# the DataList corresponds to the data that need to be written to database. If DataList is not present, then RawDataList
-# will be written to the database.
+# will be written to the database.
#
class DbItemList:
def __init__(self, ItemSize, DataList=None, RawDataList=None):
@@ -325,7 +325,7 @@ class DbItemList:
## DbExMapTblItemList
#
-# The class holds the ExMap table
+# The class holds the ExMap table
#
class DbExMapTblItemList (DbItemList):
def __init__(self, ItemSize, DataList=None, RawDataList=None):
@@ -335,15 +335,15 @@ class DbExMapTblItemList (DbItemList):
Buffer = ''
PackStr = "=LHH"
for Datas in self.RawDataList:
- Buffer += pack(PackStr,
+ Buffer += pack(PackStr,
GetIntegerValue(Datas[0]),
GetIntegerValue(Datas[1]),
- GetIntegerValue(Datas[2]))
+ GetIntegerValue(Datas[2]))
return Buffer
## DbComItemList
#
-# The DbComItemList is a special kind of DbItemList in case that the size of the List can not be computed by the
+# The DbComItemList is a special kind of DbItemList in case that the size of the List can not be computed by the
# ItemSize multiply the ItemCount.
#
class DbComItemList (DbItemList):
@@ -361,7 +361,7 @@ class DbComItemList (DbItemList):
else:
assert(Index < len(self.RawDataList))
for ItemIndex in xrange(Index):
- Offset += len(self.RawDataList[ItemIndex]) * self.ItemSize
+ Offset += len(self.RawDataList[ItemIndex]) * self.ItemSize
return Offset
@@ -400,12 +400,12 @@ class DbComItemList (DbItemList):
Buffer += pack(PackStr, GetIntegerValue(SingleData))
else:
Buffer += pack(PackStr, GetIntegerValue(Data))
-
+
return Buffer
## DbVariableTableItemList
#
-# The class holds the Variable header value table
+# The class holds the Variable header value table
#
class DbVariableTableItemList (DbComItemList):
def __init__(self, ItemSize, DataList=None, RawDataList=None):
@@ -416,7 +416,7 @@ class DbVariableTableItemList (DbComItemList):
Buffer = ''
for DataList in self.RawDataList:
for Data in DataList:
- Buffer += pack(PackStr,
+ Buffer += pack(PackStr,
GetIntegerValue(Data[0]),
GetIntegerValue(Data[1]),
GetIntegerValue(Data[2]),
@@ -429,7 +429,7 @@ class DbVariableTableItemList (DbComItemList):
class DbStringHeadTableItemList(DbItemList):
def __init__(self,ItemSize,DataList=None,RawDataList=None):
DbItemList.__init__(self, ItemSize, DataList, RawDataList)
-
+
def GetInterOffset(self, Index):
Offset = 0
if self.ItemSize == 0:
@@ -462,11 +462,11 @@ class DbStringHeadTableItemList(DbItemList):
self.ListSize += len(Datas) * self.ItemSize
else:
self.ListSize += self.ItemSize
- return self.ListSize
+ return self.ListSize
## DbSkuHeadTableItemList
#
-# The class holds the Sku header value table
+# The class holds the Sku header value table
#
class DbSkuHeadTableItemList (DbItemList):
def __init__(self, ItemSize, DataList=None, RawDataList=None):
@@ -476,14 +476,14 @@ class DbSkuHeadTableItemList (DbItemList):
PackStr = "=LL"
Buffer = ''
for Data in self.RawDataList:
- Buffer += pack(PackStr,
+ Buffer += pack(PackStr,
GetIntegerValue(Data[0]),
GetIntegerValue(Data[1]))
return Buffer
## DbSizeTableItemList
#
-# The class holds the size table
+# The class holds the size table
#
class DbSizeTableItemList (DbItemList):
def __init__(self, ItemSize, DataList=None, RawDataList=None):
@@ -498,16 +498,16 @@ class DbSizeTableItemList (DbItemList):
PackStr = "=H"
Buffer = ''
for Data in self.RawDataList:
- Buffer += pack(PackStr,
+ Buffer += pack(PackStr,
GetIntegerValue(Data[0]))
for subData in Data[1]:
- Buffer += pack(PackStr,
+ Buffer += pack(PackStr,
GetIntegerValue(subData))
return Buffer
## DbStringItemList
#
-# The class holds the string table
+# The class holds the string table
#
class DbStringItemList (DbComItemList):
def __init__(self, ItemSize, DataList=None, RawDataList=None, LenList=None):
@@ -517,7 +517,7 @@ class DbStringItemList (DbComItemList):
RawDataList = []
if LenList is None:
LenList = []
-
+
assert(len(RawDataList) == len(LenList))
DataList = []
# adjust DataList according to the LenList
@@ -576,7 +576,7 @@ def GetMatchedIndex(Key1, List1, Key2, List2):
return Index
else:
StartPos = Index + 1
-
+
return -1
@@ -584,7 +584,7 @@ def GetMatchedIndex(Key1, List1, Key2, List2):
# to List like [0x36, 0x00, 0x34, 0x00, 0x21, 0x00, 0x36, 0x00, 0x34, 0x00, 0x00, 0x00]
#
# @param StringArray A string array like {0x36, 0x00, 0x34, 0x00, 0x21, 0x00, 0x36, 0x00, 0x34, 0x00, 0x00, 0x00}
-#
+#
# @retval A list object of integer items
#
def StringArrayToList(StringArray):
@@ -596,7 +596,7 @@ def StringArrayToList(StringArray):
## Convert TokenType String like "PCD_DATUM_TYPE_UINT32 | PCD_TYPE_HII" to TokenType value
#
# @param TokenType A TokenType string like "PCD_DATUM_TYPE_UINT32 | PCD_TYPE_HII"
-#
+#
# @retval A integer representation of the TokenType
#
def GetTokenTypeValue(TokenType):
@@ -623,7 +623,7 @@ def GetTokenTypeValue(TokenType):
## construct the external Pcd database using data from Dict
#
# @param Dict A dictionary contains Pcd related tables
-#
+#
# @retval Buffer A byte stream of the Pcd database
#
def BuildExDataBase(Dict):
@@ -652,26 +652,26 @@ def BuildExDataBase(Dict):
NumberOfSkuEnabledPcd = GetIntegerValue(Dict['SKU_HEAD_SIZE'])
Dict['STRING_TABLE_DB_VALUE'] = [StringArrayToList(x) for x in Dict['STRING_TABLE_VALUE']]
-
+
StringTableValue = Dict['STRING_TABLE_DB_VALUE']
# when calcute the offset, should use StringTableLen instead of StringTableValue, as string maxium len may be different with actual len
StringTableLen = Dict['STRING_TABLE_LENGTH']
DbStringTableLen = DbStringItemList(0, RawDataList = StringTableValue, LenList = StringTableLen)
-
+
PcdTokenTable = Dict['PCD_TOKENSPACE']
PcdTokenLen = Dict['PCD_TOKENSPACE_LENGTH']
PcdTokenTableValue = [StringArrayToList(x) for x in Dict['PCD_TOKENSPACE']]
DbPcdTokenTable = DbStringItemList(0, RawDataList = PcdTokenTableValue, LenList = PcdTokenLen)
-
+
PcdCNameTable = Dict['PCD_CNAME']
PcdCNameLen = Dict['PCD_CNAME_LENGTH']
PcdCNameTableValue = [StringArrayToList(x) for x in Dict['PCD_CNAME']]
DbPcdCNameTable = DbStringItemList(0, RawDataList = PcdCNameTableValue, LenList = PcdCNameLen)
-
+
PcdNameOffsetTable = Dict['PCD_NAME_OFFSET']
DbPcdNameOffsetTable = DbItemList(4,RawDataList = PcdNameOffsetTable)
-
+
SizeTableValue = zip(Dict['SIZE_TABLE_MAXIMUM_LENGTH'], Dict['SIZE_TABLE_CURRENT_LENGTH'])
DbSizeTableValue = DbSizeTableItemList(2, RawDataList = SizeTableValue)
InitValueUint16 = Dict['INIT_DB_VALUE_UINT16']
@@ -690,7 +690,7 @@ def BuildExDataBase(Dict):
DbSkuidValue = DbItemList(8, RawDataList = SkuidValue)
-
+
# Unit Db Items
UnInitValueUint64 = Dict['UNINIT_GUID_DECL_UINT64']
DbUnInitValueUint64 = DbItemList(8, RawDataList = UnInitValueUint64)
@@ -703,12 +703,12 @@ def BuildExDataBase(Dict):
UnInitValueBoolean = Dict['UNINIT_GUID_DECL_BOOLEAN']
DbUnInitValueBoolean = DbItemList(1, RawDataList = UnInitValueBoolean)
PcdTokenNumberMap = Dict['PCD_ORDER_TOKEN_NUMBER_MAP']
-
+
DbNameTotle = ["SkuidValue", "InitValueUint64", "VardefValueUint64", "InitValueUint32", "VardefValueUint32", "VpdHeadValue", "ExMapTable",
"LocalTokenNumberTable", "GuidTable", "StringHeadValue", "PcdNameOffsetTable","VariableTable", "StringTableLen", "PcdTokenTable", "PcdCNameTable",
"SizeTableValue", "InitValueUint16", "VardefValueUint16", "InitValueUint8", "VardefValueUint8", "InitValueBoolean",
"VardefValueBoolean", "UnInitValueUint64", "UnInitValueUint32", "UnInitValueUint16", "UnInitValueUint8", "UnInitValueBoolean"]
-
+
DbTotal = [SkuidValue, InitValueUint64, VardefValueUint64, InitValueUint32, VardefValueUint32, VpdHeadValue, ExMapTable,
LocalTokenNumberTable, GuidTable, StringHeadValue, PcdNameOffsetTable,VariableTable, StringTableLen, PcdTokenTable,PcdCNameTable,
SizeTableValue, InitValueUint16, VardefValueUint16, InitValueUint8, VardefValueUint8, InitValueBoolean,
@@ -717,21 +717,21 @@ def BuildExDataBase(Dict):
DbLocalTokenNumberTable, DbGuidTable, DbStringHeadValue, DbPcdNameOffsetTable,DbVariableTable, DbStringTableLen, DbPcdTokenTable, DbPcdCNameTable,
DbSizeTableValue, DbInitValueUint16, DbVardefValueUint16, DbInitValueUint8, DbVardefValueUint8, DbInitValueBoolean,
DbVardefValueBoolean, DbUnInitValueUint64, DbUnInitValueUint32, DbUnInitValueUint16, DbUnInitValueUint8, DbUnInitValueBoolean]
-
+
# VardefValueBoolean is the last table in the init table items
InitTableNum = DbNameTotle.index("VardefValueBoolean") + 1
# The FixedHeader length of the PCD_DATABASE_INIT, from Signature to Pad
FixedHeaderLen = 80
- # Get offset of SkuId table in the database
+ # Get offset of SkuId table in the database
SkuIdTableOffset = FixedHeaderLen
for DbIndex in xrange(len(DbTotal)):
if DbTotal[DbIndex] is SkuidValue:
break
SkuIdTableOffset += DbItemTotal[DbIndex].GetListSize()
-
-
- # Get offset of SkuValue table in the database
+
+
+ # Get offset of SkuValue table in the database
# Fix up the LocalTokenNumberTable, SkuHeader table
for (LocalTokenNumberTableIndex, (Offset, Table)) in enumerate(LocalTokenNumberTable):
@@ -752,11 +752,11 @@ def BuildExDataBase(Dict):
TokenTypeValue = GetTokenTypeValue(TokenTypeValue)
LocalTokenNumberTable[LocalTokenNumberTableIndex] = DbOffset|int(TokenTypeValue)
# if PCD_TYPE_SKU_ENABLED, then we need to fix up the SkuTable
-
-
-
- # resolve variable table offset
+
+
+
+ # resolve variable table offset
for VariableEntries in VariableTable:
skuindex = 0
for VariableEntryPerSku in VariableEntries:
@@ -774,7 +774,7 @@ def BuildExDataBase(Dict):
else:
assert(False)
if isinstance(VariableRefTable[0],list):
- DbOffset += skuindex * 4
+ DbOffset += skuindex * 4
skuindex += 1
if DbIndex >= InitTableNum:
assert(False)
@@ -802,28 +802,28 @@ def BuildExDataBase(Dict):
DbTotalLength += DbItemTotal[DbIndex].GetListSize()
if not Dict['PCD_INFO_FLAG']:
- DbPcdNameOffset = 0
+ DbPcdNameOffset = 0
LocalTokenCount = GetIntegerValue(Dict['LOCAL_TOKEN_NUMBER'])
ExTokenCount = GetIntegerValue(Dict['EX_TOKEN_NUMBER'])
GuidTableCount = GetIntegerValue(Dict['GUID_TABLE_SIZE'])
SystemSkuId = GetIntegerValue(Dict['SYSTEM_SKU_ID_VALUE'])
Pad = 0xDA
-
+
UninitDataBaseSize = 0
for Item in (DbUnInitValueUint64, DbUnInitValueUint32, DbUnInitValueUint16, DbUnInitValueUint8, DbUnInitValueBoolean):
UninitDataBaseSize += Item.GetListSize()
-
+
if (DbTotalLength - UninitDataBaseSize) % 8:
DbTotalLength += (8 - (DbTotalLength - UninitDataBaseSize) % 8)
# Construct the database buffer
Guid = "{0x3c7d193c, 0x682c, 0x4c14, 0xa6, 0x8f, 0x55, 0x2d, 0xea, 0x4f, 0x43, 0x7e}"
Guid = StringArrayToList(Guid)
- Buffer = pack('=LHHBBBBBBBB',
- Guid[0],
- Guid[1],
- Guid[2],
- Guid[3],
- Guid[4],
+ Buffer = pack('=LHHBBBBBBBB',
+ Guid[0],
+ Guid[1],
+ Guid[2],
+ Guid[3],
+ Guid[4],
Guid[5],
Guid[6],
Guid[7],
@@ -851,7 +851,7 @@ def BuildExDataBase(Dict):
Buffer += b
b = pack('=L', ExMapTableOffset)
-
+
Buffer += b
b = pack('=L', GuidTableOffset)
@@ -875,7 +875,7 @@ def BuildExDataBase(Dict):
Buffer += b
b = pack('=H', GuidTableCount)
-
+
Buffer += b
b = pack('=B', Pad)
Buffer += b
@@ -884,18 +884,18 @@ def BuildExDataBase(Dict):
Buffer += b
Buffer += b
Buffer += b
-
+
Index = 0
for Item in DbItemTotal:
Index +=1
b = Item.PackData()
- Buffer += b
+ Buffer += b
if Index == InitTableNum:
if len(Buffer) % 8:
for num in range(8 - len(Buffer) % 8):
b = pack('=B', Pad)
Buffer += b
- break
+ break
return Buffer
## Create code for PCD database
@@ -1049,7 +1049,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
'SYSTEM_SKU_ID' : ' SKU_ID SystemSkuId;',
'SYSTEM_SKU_ID_VALUE' : '0U'
}
-
+
SkuObj = Platform.Platform.SkuIdMgr
Dict['SYSTEM_SKU_ID_VALUE'] = 0 if SkuObj.SkuUsageType == SkuObj.SINGLE else Platform.Platform.SkuIds[SkuObj.SystemSkuId][0]
@@ -1067,7 +1067,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
Dict[Init+'_NUMSKUS_DECL_' + DatumType] = []
Dict[Init+'_VALUE_' + DatumType] = []
Dict[Init+'_DB_VALUE_'+DatumType] = []
-
+
for Type in ['STRING_HEAD','VPD_HEAD','VARIABLE_HEAD']:
Dict[Type + '_CNAME_DECL'] = []
Dict[Type + '_GUID_DECL'] = []
@@ -1077,7 +1077,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
Dict['STRING_DB_VALUE'] = []
Dict['VPD_DB_VALUE'] = []
Dict['VARIABLE_DB_VALUE'] = []
-
+
Dict['STRING_TABLE_INDEX'] = []
Dict['STRING_TABLE_LENGTH'] = []
Dict['STRING_TABLE_CNAME'] = []
@@ -1100,19 +1100,19 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
Dict['LOCAL_TOKEN_NUMBER_DB_VALUE'] = []
Dict['VARIABLE_DB_VALUE'] = []
-
+
Dict['PCD_TOKENSPACE'] = []
- Dict['PCD_CNAME'] = []
+ Dict['PCD_CNAME'] = []
Dict['PCD_TOKENSPACE_LENGTH'] = []
Dict['PCD_CNAME_LENGTH'] = []
Dict['PCD_TOKENSPACE_OFFSET'] = []
Dict['PCD_CNAME_OFFSET'] = []
Dict['PCD_TOKENSPACE_MAP'] = []
Dict['PCD_NAME_OFFSET'] = []
-
+
Dict['PCD_ORDER_TOKEN_NUMBER_MAP'] = {}
PCD_STRING_INDEX_MAP = {}
-
+
StringTableIndex = 0
StringTableSize = 0
NumberOfLocalTokens = 0
@@ -1181,8 +1181,8 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
if len(Pcd.SkuInfoList) > 1:
# Pcd.TokenTypeList += ['PCD_TYPE_SKU_ENABLED']
NumberOfSkuEnabledPcd += 1
-
- SkuIdIndex = 1
+
+ SkuIdIndex = 1
VariableHeadList = []
for SkuName in Pcd.SkuInfoList:
Sku = Pcd.SkuInfoList[SkuName]
@@ -1190,9 +1190,9 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
if SkuId is None or SkuId == '':
continue
-
+
SkuIdIndex += 1
-
+
if len(Sku.VariableName) > 0:
VariableGuidStructure = Sku.VariableGuidValue
VariableGuid = GuidStructureStringToGuidValueName(VariableGuidStructure)
@@ -1243,7 +1243,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
for Index in range(Dict['STRING_TABLE_VALUE'].index(VariableNameStructure)):
VariableHeadStringIndex += Dict['STRING_TABLE_LENGTH'][Index]
VariableHeadList.append(VariableHeadStringIndex)
-
+
VariableHeadStringIndex = VariableHeadList[SkuIdIndex - 2]
# store VariableGuid to GuidTable and get the VariableHeadGuidIndex
@@ -1254,11 +1254,11 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
if "PCD_TYPE_STRING" in Pcd.TokenTypeList:
VariableHeadValueList.append('%dU, offsetof(%s_PCD_DATABASE, Init.%s_%s), %dU, %sU' %
- (VariableHeadStringIndex, Phase, CName, TokenSpaceGuid,
+ (VariableHeadStringIndex, Phase, CName, TokenSpaceGuid,
VariableHeadGuidIndex, Sku.VariableOffset))
else:
VariableHeadValueList.append('%dU, offsetof(%s_PCD_DATABASE, Init.%s_%s_VariableDefault_%s), %dU, %sU' %
- (VariableHeadStringIndex, Phase, CName, TokenSpaceGuid, SkuIdIndex,
+ (VariableHeadStringIndex, Phase, CName, TokenSpaceGuid, SkuIdIndex,
VariableHeadGuidIndex, Sku.VariableOffset))
Dict['VARDEF_CNAME_'+Pcd.DatumType].append(CName)
Dict['VARDEF_GUID_'+Pcd.DatumType].append(TokenSpaceGuid)
@@ -1271,7 +1271,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
# warning under linux building environment.
#
Dict['VARDEF_DB_VALUE_'+Pcd.DatumType].append(Sku.HiiDefaultValue)
-
+
if Pcd.DatumType == TAB_UINT64:
Dict['VARDEF_VALUE_'+Pcd.DatumType].append(Sku.HiiDefaultValue + "ULL")
elif Pcd.DatumType in (TAB_UINT32, TAB_UINT16, TAB_UINT8):
@@ -1304,13 +1304,13 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
Pcd.InitString = 'INIT'
VpdHeadOffsetList.append(str(Sku.VpdOffset) + 'U')
VpdDbOffsetList.append(Sku.VpdOffset)
- # Also add the VOID* string of VPD PCD to SizeTable
+ # Also add the VOID* string of VPD PCD to SizeTable
if Pcd.DatumType == TAB_VOID:
NumberOfSizeItems += 1
# For VPD type of PCD, its current size is equal to its MAX size.
- VoidStarTypeCurrSize = [str(Pcd.MaxDatumSize) + 'U']
+ VoidStarTypeCurrSize = [str(Pcd.MaxDatumSize) + 'U']
continue
-
+
if Pcd.DatumType == TAB_VOID:
Pcd.TokenTypeList += ['PCD_TYPE_STRING']
Pcd.InitString = 'INIT'
@@ -1337,7 +1337,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
DefaultValueBinStructure = StringToArray(Sku.DefaultValue)
Size = len(Sku.DefaultValue.split(","))
Dict['STRING_TABLE_VALUE'].append(DefaultValueBinStructure)
-
+
StringHeadOffsetList.append(str(StringTableSize) + 'U')
StringDbOffsetList.append(StringTableSize)
if Pcd.MaxDatumSize != '':
@@ -1376,10 +1376,10 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
ValueList.append(Sku.DefaultValue + "U")
elif Pcd.DatumType == "BOOLEAN":
if Sku.DefaultValue in ["1", "0"]:
- ValueList.append(Sku.DefaultValue + "U")
+ ValueList.append(Sku.DefaultValue + "U")
else:
ValueList.append(Sku.DefaultValue)
-
+
DbValueList.append(Sku.DefaultValue)
Pcd.TokenTypeList = list(set(Pcd.TokenTypeList))
@@ -1388,8 +1388,8 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
Dict['SIZE_TABLE_GUID'].append(TokenSpaceGuid)
Dict['SIZE_TABLE_MAXIMUM_LENGTH'].append(str(Pcd.MaxDatumSize) + 'U')
Dict['SIZE_TABLE_CURRENT_LENGTH'].append(VoidStarTypeCurrSize)
-
-
+
+
if 'PCD_TYPE_HII' in Pcd.TokenTypeList:
Dict['VARIABLE_HEAD_CNAME_DECL'].append(CName)
@@ -1422,7 +1422,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
else:
Dict[Pcd.InitString+'_VALUE_'+Pcd.DatumType].append(', '.join(ValueList))
Dict[Pcd.InitString+'_DB_VALUE_'+Pcd.DatumType].append(DbValueList)
-
+
if Phase == 'PEI':
NumberOfLocalTokens = NumberOfPeiLocalTokens
if Phase == 'DXE':
@@ -1434,7 +1434,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
Dict['TOKEN_TYPE'] = ['' for x in range(NumberOfLocalTokens)]
Dict['LOCAL_TOKEN_NUMBER_DB_VALUE'] = ['' for x in range(NumberOfLocalTokens)]
Dict['PCD_CNAME'] = ['' for x in range(NumberOfLocalTokens)]
- Dict['PCD_TOKENSPACE_MAP'] = ['' for x in range(NumberOfLocalTokens)]
+ Dict['PCD_TOKENSPACE_MAP'] = ['' for x in range(NumberOfLocalTokens)]
Dict['PCD_CNAME_LENGTH'] = [0 for x in range(NumberOfLocalTokens)]
SkuEnablePcdIndex = 0
for Pcd in ReorderedDynPcdList:
@@ -1459,7 +1459,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
EdkLogger.debug(EdkLogger.DEBUG_1, "PCD = %s.%s" % (CName, TokenSpaceGuidCName))
EdkLogger.debug(EdkLogger.DEBUG_1, "phase = %s" % Phase)
EdkLogger.debug(EdkLogger.DEBUG_1, "GeneratedTokenNumber = %s" % str(GeneratedTokenNumber))
-
+
#
# following four Dict items hold the information for LocalTokenNumberTable
#
@@ -1470,7 +1470,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
Dict['TOKEN_CNAME'][GeneratedTokenNumber] = CName
Dict['TOKEN_GUID'][GeneratedTokenNumber] = TokenSpaceGuid
Dict['TOKEN_TYPE'][GeneratedTokenNumber] = ' | '.join(Pcd.TokenTypeList)
-
+
if Platform.Platform.PcdInfoFlag:
TokenSpaceGuidCNameArray = StringToArray('"' + TokenSpaceGuidCName + '"' )
if TokenSpaceGuidCNameArray not in Dict['PCD_TOKENSPACE']:
@@ -1479,10 +1479,10 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
Dict['PCD_TOKENSPACE_MAP'][GeneratedTokenNumber] = Dict['PCD_TOKENSPACE'].index(TokenSpaceGuidCNameArray)
CNameBinArray = StringToArray('"' + CName + '"' )
Dict['PCD_CNAME'][GeneratedTokenNumber] = CNameBinArray
-
+
Dict['PCD_CNAME_LENGTH'][GeneratedTokenNumber] = len(CNameBinArray.split(","))
-
-
+
+
Pcd.TokenTypeList = list(set(Pcd.TokenTypeList))
# search the Offset and Table, used by LocalTokenNumberTableOffset
@@ -1508,7 +1508,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
if Pcd.InitString == 'UNINIT':
Table = Dict[Pcd.InitString+'_GUID_DECL_'+Pcd.DatumType]
else:
- Table = Dict[Pcd.InitString+'_DB_VALUE_'+Pcd.DatumType]
+ Table = Dict[Pcd.InitString+'_DB_VALUE_'+Pcd.DatumType]
Dict['LOCAL_TOKEN_NUMBER_DB_VALUE'][GeneratedTokenNumber] = (Offset, Table)
#
@@ -1518,10 +1518,10 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
Dict['VARDEF_HEADER'][GeneratedTokenNumber] = '_Variable_Header'
else:
Dict['VARDEF_HEADER'][GeneratedTokenNumber] = ''
-
-
+
+
if Pcd.Type in gDynamicExPcd:
-
+
if Phase == 'DXE':
GeneratedTokenNumber += NumberOfPeiLocalTokens
#
@@ -1533,7 +1533,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
# Therefore, 1 is added to GeneratedTokenNumber to generate a PCD Token Number before being inserted
# to the EXMAPPING_TABLE.
#
-
+
Dict['EXMAPPING_TABLE_EXTOKEN'].append(str(Pcd.TokenValue) + 'U')
Dict['EXMAPPING_TABLE_LOCAL_TOKEN'].append(str(GeneratedTokenNumber + 1) + 'U')
@@ -1544,12 +1544,12 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
TokenSpaceIndex = StringTableSize
for i in range(Dict['PCD_TOKENSPACE_MAP'][index]):
TokenSpaceIndex += Dict['PCD_TOKENSPACE_LENGTH'][i]
- Dict['PCD_TOKENSPACE_OFFSET'].append(TokenSpaceIndex)
+ Dict['PCD_TOKENSPACE_OFFSET'].append(TokenSpaceIndex)
for index in range(len(Dict['PCD_TOKENSPACE'])):
StringTableSize += Dict['PCD_TOKENSPACE_LENGTH'][index]
StringTableIndex += 1
for index in range(len(Dict['PCD_CNAME'])):
- Dict['PCD_CNAME_OFFSET'].append(StringTableSize)
+ Dict['PCD_CNAME_OFFSET'].append(StringTableSize)
Dict['PCD_NAME_OFFSET'].append(Dict['PCD_TOKENSPACE_OFFSET'][index])
Dict['PCD_NAME_OFFSET'].append(StringTableSize)
StringTableSize += Dict['PCD_CNAME_LENGTH'][index]
@@ -1592,15 +1592,15 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
if NumberOfSizeItems != 0:
Dict['SIZE_TABLE_SIZE'] = str(NumberOfSizeItems * 2) + 'U'
-
- if NumberOfSkuEnabledPcd != 0:
+
+ if NumberOfSkuEnabledPcd != 0:
Dict['SKU_HEAD_SIZE'] = str(NumberOfSkuEnabledPcd) + 'U'
-
+
for AvailableSkuNumber in SkuObj.SkuIdNumberSet:
if AvailableSkuNumber not in Dict['SKUID_VALUE']:
Dict['SKUID_VALUE'].append(AvailableSkuNumber)
Dict['SKUID_VALUE'][0] = len(Dict['SKUID_VALUE']) - 1
-
+
AutoGenH.Append(gPcdDatabaseAutoGenH.Replace(Dict))
if NumberOfLocalTokens == 0:
AutoGenC.Append(gEmptyPcdDatabaseAutoGenC.Replace(Dict))
@@ -1613,11 +1613,11 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
SizeCurLenTempList = []
SizeMaxLenTempList = []
ReOrderFlag = True
-
+
if len(Dict['SIZE_TABLE_CNAME']) == 1:
if not (Dict['SIZE_TABLE_CNAME'][0] and Dict['SIZE_TABLE_GUID'][0]):
ReOrderFlag = False
-
+
if ReOrderFlag:
for Count in range(len(Dict['TOKEN_CNAME'])):
for Count1 in range(len(Dict['SIZE_TABLE_CNAME'])):
@@ -1627,15 +1627,15 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
SizeGuidTempList.append(Dict['SIZE_TABLE_GUID'][Count1])
SizeCurLenTempList.append(Dict['SIZE_TABLE_CURRENT_LENGTH'][Count1])
SizeMaxLenTempList.append(Dict['SIZE_TABLE_MAXIMUM_LENGTH'][Count1])
-
+
for Count in range(len(Dict['SIZE_TABLE_CNAME'])):
Dict['SIZE_TABLE_CNAME'][Count] = SizeCNameTempList[Count]
Dict['SIZE_TABLE_GUID'][Count] = SizeGuidTempList[Count]
Dict['SIZE_TABLE_CURRENT_LENGTH'][Count] = SizeCurLenTempList[Count]
Dict['SIZE_TABLE_MAXIMUM_LENGTH'][Count] = SizeMaxLenTempList[Count]
-
+
AutoGenC.Append(gPcdDatabaseAutoGenC.Replace(Dict))
-
+
# print Phase
Buffer = BuildExDataBase(Dict)
diff --git a/BaseTools/Source/Python/AutoGen/InfSectionParser.py b/BaseTools/Source/Python/AutoGen/InfSectionParser.py
index 2cd5a6667a02..d98508973841 100644
--- a/BaseTools/Source/Python/AutoGen/InfSectionParser.py
+++ b/BaseTools/Source/Python/AutoGen/InfSectionParser.py
@@ -17,14 +17,14 @@
import Common.EdkLogger as EdkLogger
from Common.BuildToolError import *
from Common.DataType import *
-
+
class InfSectionParser():
def __init__(self, FilePath):
self._FilePath = FilePath
self._FileSectionDataList = []
self._ParserInf()
-
+
def _ParserInf(self):
FileLinesList = []
UserExtFind = False
@@ -32,12 +32,12 @@ class InfSectionParser():
FileLastLine = False
SectionLine = ''
SectionData = []
-
+
try:
FileLinesList = open(self._FilePath, "r", 0).readlines()
except BaseException:
EdkLogger.error("build", AUTOGEN_ERROR, 'File %s is opened failed.' % self._FilePath)
-
+
for Index in range(0, len(FileLinesList)):
line = str(FileLinesList[Index]).strip()
if Index + 1 == len(FileLinesList):
@@ -52,7 +52,7 @@ class InfSectionParser():
SectionLine = line
UserExtFind = True
FindEnd = False
-
+
if (NextLine != '' and NextLine[0] == TAB_SECTION_START and \
NextLine[-1] == TAB_SECTION_END) or FileLastLine:
UserExtFind = False
@@ -60,7 +60,7 @@ class InfSectionParser():
self._FileSectionDataList.append({SectionLine: SectionData[:]})
del SectionData[:]
SectionLine = ''
-
+
# Get user extension TianoCore data
#
# @return: a list include some dictionary that key is section and value is a list contain all data.
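Aside, not from the patch itself: the parser above groups INF lines under their enclosing [section] headers into a list of one-entry dicts. A minimal sketch of that grouping idea, with made-up names and without the user-extension handling of the real class:

    def split_into_sections(lines):
        # group plain-text lines under their enclosing [section] header;
        # returns a list of {header: [lines]} dicts, one per section seen
        sections = []
        header = None
        body = []
        for raw in lines:
            line = raw.strip()
            if not line or line.startswith('#'):
                continue                      # skip blanks and comments
            if line.startswith('[') and line.endswith(']'):
                if header is not None:
                    sections.append({header: body})
                header, body = line, []
            elif header is not None:
                body.append(line)
        if header is not None:
            sections.append({header: body})
        return sections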
diff --git a/BaseTools/Source/Python/AutoGen/StrGather.py b/BaseTools/Source/Python/AutoGen/StrGather.py
index 73af1214eb0a..e97a3175e991 100644
--- a/BaseTools/Source/Python/AutoGen/StrGather.py
+++ b/BaseTools/Source/Python/AutoGen/StrGather.py
@@ -1,5 +1,5 @@
## @file
-# This file is used to parse a strings file and create or add to a string database
+# This file is used to parse a strings file and create or add to a string database
# file.
#
# Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -144,7 +144,7 @@ def CreateHFileContent(BaseName, UniObjectClass, IsCompatibleMode, UniGenCFlag):
Str = WriteLine(Str, Line)
UnusedStr = ''
- #Group the referred/Unused STRING token together.
+ #Group the referred/Unused STRING token together.
for Index in range(2, len(UniObjectClass.OrderedStringList[UniObjectClass.LanguageDef[0][0]])):
StringItem = UniObjectClass.OrderedStringList[UniObjectClass.LanguageDef[0][0]][Index]
Name = StringItem.StringName
@@ -265,16 +265,16 @@ def GetFilteredLanguage(UniLanguageList, LanguageFilterList):
PrimaryTag = Language[0:Language.find('-')].lower()
else:
PrimaryTag = Language
-
+
if len(PrimaryTag) == 3:
PrimaryTag = LangConvTable.get(PrimaryTag)
-
+
for UniLanguage in UniLanguageList:
if UniLanguage.find('-') != -1:
UniLanguagePrimaryTag = UniLanguage[0:UniLanguage.find('-')].lower()
else:
UniLanguagePrimaryTag = UniLanguage
-
+
if len(UniLanguagePrimaryTag) == 3:
UniLanguagePrimaryTag = LangConvTable.get(UniLanguagePrimaryTag)
@@ -307,7 +307,7 @@ def GetFilteredLanguage(UniLanguageList, LanguageFilterList):
# @param UniObjectClass A UniObjectClass instance
# @param IsCompatibleMode Compatible mode
# @param UniBinBuffer UniBinBuffer to contain UniBinary data.
-# @param FilterInfo Platform language filter information
+# @param FilterInfo Platform language filter information
#
# @retval Str: A string of .c file content
#
@@ -325,14 +325,14 @@ def CreateCFileContent(BaseName, UniObjectClass, IsCompatibleMode, UniBinBuffer,
else:
# EDK module is using ISO639-2 format filter, convert to the RFC4646 format
LanguageFilterList = [LangConvTable.get(F.lower()) for F in FilterInfo[1]]
-
+
UniLanguageList = []
for IndexI in range(len(UniObjectClass.LanguageDef)):
UniLanguageList += [UniObjectClass.LanguageDef[IndexI][0]]
UniLanguageListFiltered = GetFilteredLanguage(UniLanguageList, LanguageFilterList)
-
-
+
+
#
# Create lines for each language's strings
#
@@ -340,7 +340,7 @@ def CreateCFileContent(BaseName, UniObjectClass, IsCompatibleMode, UniBinBuffer,
Language = UniObjectClass.LanguageDef[IndexI][0]
if Language not in UniLanguageListFiltered:
continue
-
+
StringBuffer = StringIO()
StrStringValue = ''
ArrayLength = 0
@@ -403,7 +403,7 @@ def CreateCFileContent(BaseName, UniObjectClass, IsCompatibleMode, UniBinBuffer,
# Add an EFI_HII_SIBT_END at last
#
Str = WriteLine(Str, ' ' + EFI_HII_SIBT_END + ",")
-
+
#
# Create binary UNI string
#
@@ -458,7 +458,7 @@ def CreateCFileEnd():
# @param BaseName: The basename of strings
# @param UniObjectClass A UniObjectClass instance
# @param IsCompatibleMode Compatible Mode
-# @param FilterInfo Platform language filter information
+# @param FilterInfo Platform language filter information
#
# @retval CFile: A string of complete .c file
#
@@ -544,7 +544,7 @@ def SearchString(UniObjectClass, FileList, IsCompatibleMode):
# This function is used for UEFI2.1 spec
#
#
-def GetStringFiles(UniFilList, SourceFileList, IncludeList, IncludePathList, SkipList, BaseName, IsCompatibleMode = False, ShellMode = False, UniGenCFlag = True, UniGenBinBuffer = None, FilterInfo = [True, []]):
+def GetStringFiles(UniFilList, SourceFileList, IncludeList, IncludePathList, SkipList, BaseName, IsCompatibleMode = False, ShellMode = False, UniGenCFlag = True, UniGenBinBuffer = None, FilterInfo = [True, []]):
if len(UniFilList) > 0:
if ShellMode:
#
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index 54b6fb22a08a..ba451044f8e9 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -283,7 +283,7 @@ class UniFileClassObject(object):
if not IsLangInDef:
#
# The found STRING tokens will be added into new language string list
- # so that the unique STRING identifier is reserved for all languages in the package list.
+ # so that the unique STRING identifier is reserved for all languages in the package list.
#
FirstLangName = self.LanguageDef[0][0]
if LangName != FirstLangName:
@@ -410,10 +410,10 @@ class UniFileClassObject(object):
#
# Ignore empty line
#
- if len(Line) == 0:
- continue
-
-
+ if len(Line) == 0:
+ continue
+
+
Line = Line.replace(u'/langdef', u'#langdef')
Line = Line.replace(u'/string', u'#string')
Line = Line.replace(u'/language', u'#language')
@@ -428,8 +428,8 @@ class UniFileClassObject(object):
Line = Line.replace(u'\\r', CR)
Line = Line.replace(u'\\t', u' ')
Line = Line.replace(u'\t', u' ')
- Line = Line.replace(u'\\"', u'"')
- Line = Line.replace(u"\\'", u"'")
+ Line = Line.replace(u'\\"', u'"')
+ Line = Line.replace(u"\\'", u"'")
Line = Line.replace(BACK_SLASH_PLACEHOLDER, u'\\')
StartPos = Line.find(u'\\x')
@@ -569,7 +569,7 @@ class UniFileClassObject(object):
else:
EdkLogger.error('Unicode File Parser', FORMAT_NOT_SUPPORTED, "The language '%s' for %s is not defined in Unicode file %s." \
% (Language, Name, self.File))
-
+
if Language not in self.OrderedStringList:
self.OrderedStringList[Language] = []
self.OrderedStringDict[Language] = {}
@@ -591,7 +591,7 @@ class UniFileClassObject(object):
for LangName in self.LanguageDef:
#
# New STRING token will be added into all language string lists.
- # so that the unique STRING identifier is reserved for all languages in the package list.
+ # so that the unique STRING identifier is reserved for all languages in the package list.
#
if LangName[0] != Language:
if UseOtherLangDef != '':
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index 92c8fe2df904..f5b1574e4440 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -24,7 +24,7 @@ from Common.DataType import *
class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
def __init__(self):
self.var_check_info = []
-
+
def push_back(self, var_check_tab):
for tab in self.var_check_info:
if tab.equal(var_check_tab):
@@ -32,15 +32,15 @@ class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
break
else:
self.var_check_info.append(var_check_tab)
-
+
def dump(self, dest, Phase):
-
+
FormatMap = {}
FormatMap[1] = "=B"
FormatMap[2] = "=H"
FormatMap[4] = "=L"
FormatMap[8] = "=Q"
-
+
if not os.path.isabs(dest):
return
if not os.path.exists(dest):
@@ -179,7 +179,7 @@ class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
b = pack("=B", var_check_tab.pad)
Buffer += b
realLength += 1
-
+
DbFile = StringIO()
if Phase == 'DXE' and os.path.exists(BinFilePath):
BinFile = open(BinFilePath, "rb")
@@ -193,7 +193,7 @@ class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
Buffer = BinBuffer + Buffer
DbFile.write(Buffer)
SaveFileOnChange(BinFilePath, DbFile.getvalue(), True)
-
+
class VAR_CHECK_PCD_VARIABLE_TAB(object):
pad = 0xDA
@@ -211,26 +211,26 @@ class VAR_CHECK_PCD_VARIABLE_TAB(object):
def UpdateSize(self):
self.HeaderLength = 32 + len(self.Name.split(","))
self.Length = 32 + len(self.Name.split(",")) + self.GetValidTabLen()
-
+
def GetValidTabLen(self):
validtablen = 0
for item in self.validtab:
- validtablen += item.Length
- return validtablen
-
+ validtablen += item.Length
+ return validtablen
+
def SetAttributes(self, attributes):
self.Attributes = attributes
-
+
def push_back(self, valid_obj):
if valid_obj is not None:
self.validtab.append(valid_obj)
-
+
def equal(self, varchecktab):
if self.Guid == varchecktab.Guid and self.Name == varchecktab.Name:
return True
else:
return False
-
+
def merge(self, varchecktab):
for validobj in varchecktab.validtab:
if validobj in self.validtab:
@@ -253,10 +253,10 @@ class VAR_CHECK_PCD_VALID_OBJ(object):
except:
self.StorageWidth = 0
self.ValidData = False
-
- def __eq__(self, validObj):
+
+ def __eq__(self, validObj):
return validObj and self.VarOffset == validObj.VarOffset
-
+
class VAR_CHECK_PCD_VALID_LIST(VAR_CHECK_PCD_VALID_OBJ):
def __init__(self, VarOffset, validlist, PcdDataType):
super(VAR_CHECK_PCD_VALID_LIST, self).__init__(VarOffset, validlist, PcdDataType)
@@ -264,7 +264,7 @@ class VAR_CHECK_PCD_VALID_LIST(VAR_CHECK_PCD_VALID_OBJ):
valid_num_list = []
for item in self.rawdata:
valid_num_list.extend(item.split(','))
-
+
for valid_num in valid_num_list:
valid_num = valid_num.strip()
@@ -273,10 +273,10 @@ class VAR_CHECK_PCD_VALID_LIST(VAR_CHECK_PCD_VALID_OBJ):
else:
self.data.add(int(valid_num))
-
+
self.Length = 5 + len(self.data) * self.StorageWidth
-
-
+
+
class VAR_CHECK_PCD_VALID_RANGE(VAR_CHECK_PCD_VALID_OBJ):
def __init__(self, VarOffset, validrange, PcdDataType):
super(VAR_CHECK_PCD_VALID_RANGE, self).__init__(VarOffset, validrange, PcdDataType)
@@ -293,7 +293,7 @@ class VAR_CHECK_PCD_VALID_RANGE(VAR_CHECK_PCD_VALID_OBJ):
for obj in rangelist.pop():
self.data.add((obj.start, obj.end))
self.Length = 5 + len(self.data) * 2 * self.StorageWidth
-
+
def GetValidationObject(PcdClass, VarOffset):
if PcdClass.validateranges:
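Aside, not from the patch: VAR_CHECK_PCD_VALID_LIST above splits its raw entries on commas, drops the parsed numbers into a set so duplicates collapse, and sizes the table as 5 header bytes plus one storage-width slot per unique value. A small, hypothetical illustration of that bookkeeping:

    def build_valid_value_table(raw_entries, storage_width):
        # raw_entries: strings such as "1, 2, 0x10", possibly overlapping;
        # duplicates collapse because the parsed values land in a set
        values = set()
        for entry in raw_entries:
            for token in entry.split(','):
                token = token.strip()
                if not token:
                    continue
                base = 16 if token.lower().startswith('0x') else 10
                values.add(int(token, base))
        # 5 bytes of per-entry header plus one slot per unique value,
        # mirroring the Length computation in the class above
        return values, 5 + len(values) * storage_width

    # e.g. build_valid_value_table(["1,2", "2, 0x10"], 2) -> ({1, 2, 16}, 11)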
diff --git a/BaseTools/Source/Python/BPDG/BPDG.py b/BaseTools/Source/Python/BPDG/BPDG.py
index 6c8f89f5d12b..4f7a73b7e688 100644
--- a/BaseTools/Source/Python/BPDG/BPDG.py
+++ b/BaseTools/Source/Python/BPDG/BPDG.py
@@ -1,9 +1,9 @@
## @file
# Intel Binary Product Data Generation Tool (Intel BPDG).
-# This tool provide a simple process for the creation of a binary file containing read-only
-# configuration data for EDK II platforms that contain Dynamic and DynamicEx PCDs described
-# in VPD sections. It also provide an option for specifying an alternate name for a mapping
-# file of PCD layout for use during the build when the platform integrator selects to use
+# This tool provide a simple process for the creation of a binary file containing read-only
+# configuration data for EDK II platforms that contain Dynamic and DynamicEx PCDs described
+# in VPD sections. It also provide an option for specifying an alternate name for a mapping
+# file of PCD layout for use during the build when the platform integrator selects to use
# automatic offset calculation.
#
# Copyright (c) 2010 - 2016, Intel Corporation. All rights reserved.<BR>
@@ -46,26 +46,26 @@ VERSION = (st.LBL_BPDG_VERSION + " Build " + gBUILD_VERSION)
#
def main():
global Options, Args
-
+
# Initialize log system
- EdkLogger.Initialize()
+ EdkLogger.Initialize()
Options, Args = MyOptionParser()
-
+
ReturnCode = 0
-
+
if Options.opt_verbose:
EdkLogger.SetLevel(EdkLogger.VERBOSE)
elif Options.opt_quiet:
EdkLogger.SetLevel(EdkLogger.QUIET)
elif Options.debug_level is not None:
- EdkLogger.SetLevel(Options.debug_level + 1)
+ EdkLogger.SetLevel(Options.debug_level + 1)
else:
EdkLogger.SetLevel(EdkLogger.INFO)
-
+
if Options.bin_filename is None:
- EdkLogger.error("BPDG", ATTRIBUTE_NOT_AVAILABLE, "Please use the -o option to specify the file name for the VPD binary file")
+ EdkLogger.error("BPDG", ATTRIBUTE_NOT_AVAILABLE, "Please use the -o option to specify the file name for the VPD binary file")
if Options.filename is None:
- EdkLogger.error("BPDG", ATTRIBUTE_NOT_AVAILABLE, "Please use the -m option to specify the file name for the mapping file")
+ EdkLogger.error("BPDG", ATTRIBUTE_NOT_AVAILABLE, "Please use the -m option to specify the file name for the mapping file")
Force = False
if Options.opt_force is not None:
@@ -75,8 +75,8 @@ def main():
StartBpdg(Args[0], Options.filename, Options.bin_filename, Force)
else :
EdkLogger.error("BPDG", ATTRIBUTE_NOT_AVAILABLE, "Please specify the file which contain the VPD pcd info.",
- None)
-
+ None)
+
return ReturnCode
@@ -86,8 +86,8 @@ def main():
#
# @retval options A optparse.Values object containing the parsed options
# @retval args Target of BPDG command
-#
-def MyOptionParser():
+#
+def MyOptionParser():
#
# Process command line firstly.
#
@@ -105,10 +105,10 @@ def MyOptionParser():
parser.add_option('-o', '--vpd-filename', action='store', dest='bin_filename',
help=st.MSG_OPTION_VPD_FILENAME)
parser.add_option('-m', '--map-filename', action='store', dest='filename',
- help=st.MSG_OPTION_MAP_FILENAME)
+ help=st.MSG_OPTION_MAP_FILENAME)
parser.add_option('-f', '--force', action='store_true', dest='opt_force',
- help=st.MSG_OPTION_FORCE)
-
+ help=st.MSG_OPTION_FORCE)
+
(options, args) = parser.parse_args()
if len(args) == 0:
EdkLogger.info("Please specify the filename.txt file which contain the VPD pcd info!")
@@ -117,7 +117,7 @@ def MyOptionParser():
return options, args
-## Start BPDG and call the main functions
+## Start BPDG and call the main functions
#
# This method mainly focus on call GenVPD class member functions to complete
# BPDG's target. It will process VpdFile override, and provide the interface file
@@ -136,19 +136,19 @@ def StartBpdg(InputFileName, MapFileName, VpdFileName, Force):
choice = sys.stdin.readline()
if choice.strip().lower() not in ['y', 'yes', '']:
return
-
+
GenVPD = GenVpd.GenVPD (InputFileName, MapFileName, VpdFileName)
-
- EdkLogger.info('%-24s = %s' % ("VPD input data file: ", InputFileName))
+
+ EdkLogger.info('%-24s = %s' % ("VPD input data file: ", InputFileName))
EdkLogger.info('%-24s = %s' % ("VPD output map file: ", MapFileName))
- EdkLogger.info('%-24s = %s' % ("VPD output binary file: ", VpdFileName))
-
+ EdkLogger.info('%-24s = %s' % ("VPD output binary file: ", VpdFileName))
+
GenVPD.ParserInputFile()
GenVPD.FormatFileLine()
GenVPD.FixVpdOffset()
GenVPD.GenerateVpdFile(MapFileName, VpdFileName)
-
- EdkLogger.info("- Vpd pcd fixed done! -")
+
+ EdkLogger.info("- Vpd pcd fixed done! -")
if __name__ == '__main__':
r = main()
@@ -156,4 +156,4 @@ if __name__ == '__main__':
if r < 0 or r > 127: r = 1
sys.exit(r)
-
+
diff --git a/BaseTools/Source/Python/BPDG/GenVpd.py b/BaseTools/Source/Python/BPDG/GenVpd.py
index 69a9665f5a76..f83d477c35f0 100644
--- a/BaseTools/Source/Python/BPDG/GenVpd.py
+++ b/BaseTools/Source/Python/BPDG/GenVpd.py
@@ -31,10 +31,10 @@ _FORMAT_CHAR = {1: 'B',
## The VPD PCD data structure for store and process each VPD PCD entry.
#
-# This class contain method to format and pack pcd's value.
+# This class contain method to format and pack pcd's value.
#
class PcdEntry:
- def __init__(self, PcdCName, SkuId,PcdOffset, PcdSize, PcdValue, Lineno=None, FileName=None, PcdUnpackValue=None,
+ def __init__(self, PcdCName, SkuId,PcdOffset, PcdSize, PcdValue, Lineno=None, FileName=None, PcdUnpackValue=None,
PcdBinOffset=None, PcdBinSize=None, Alignment=None):
self.PcdCName = PcdCName.strip()
self.SkuId = SkuId.strip()
@@ -47,7 +47,7 @@ class PcdEntry:
self.PcdBinOffset = PcdBinOffset
self.PcdBinSize = PcdBinSize
self.Alignment = Alignment
-
+
if self.PcdValue == '' :
EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
"Invalid PCD format(Name: %s File: %s line: %s) , no Value specified!" % (self.PcdCName, self.FileName, self.Lineno))
@@ -63,13 +63,13 @@ class PcdEntry:
self._GenOffsetValue ()
## Analyze the string value to judge the PCD's datum type equal to Boolean or not.
- #
+ #
# @param ValueString PCD's value
# @param Size PCD's size
- #
+ #
# @retval True PCD's datum type is Boolean
- # @retval False PCD's datum type is not Boolean.
- #
+ # @retval False PCD's datum type is not Boolean.
+ #
def _IsBoolean(self, ValueString, Size):
if (Size == "1"):
if ValueString.upper() in ["TRUE", "FALSE"]:
@@ -80,10 +80,10 @@ class PcdEntry:
return False
## Convert the PCD's value from string to integer.
- #
+ #
# This function will try to convert the Offset value form string to integer
# for both hexadecimal and decimal.
- #
+ #
def _GenOffsetValue(self):
if self.PcdOffset != "*" :
try:
@@ -96,10 +96,10 @@ class PcdEntry:
"Invalid offset value %s for PCD %s (File: %s Line: %s)" % (self.PcdOffset, self.PcdCName, self.FileName, self.Lineno))
## Pack Boolean type VPD PCD's value form string to binary type.
- #
+ #
# @param ValueString The boolean type string for pack.
- #
- #
+ #
+ #
def _PackBooleanValue(self, ValueString):
if ValueString.upper() == "TRUE" or ValueString in ["1", "0x1", "0x01"]:
try:
@@ -115,10 +115,10 @@ class PcdEntry:
"Invalid size or value for PCD %s to pack(File: %s Line: %s)." % (self.PcdCName, self.FileName, self.Lineno))
## Pack Integer type VPD PCD's value form string to binary type.
- #
+ #
# @param ValueString The Integer type string for pack.
- #
- #
+ #
+ #
def _PackIntValue(self, IntValue, Size):
if Size not in _FORMAT_CHAR:
EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
@@ -170,7 +170,7 @@ class PcdEntry:
# 3: {bytearray}, only support byte-array.
#
# @param ValueString The Integer type string for pack.
- #
+ #
def _PackPtrValue(self, ValueString, Size):
if ValueString.startswith('L"') or ValueString.startswith("L'"):
self._PackUnicode(ValueString, Size)
@@ -183,9 +183,9 @@ class PcdEntry:
"Invalid VOID* type PCD %s value %s (File: %s Line: %s)" % (self.PcdCName, ValueString, self.FileName, self.Lineno))
## Pack an Ascii PCD value.
- #
+ #
# An Ascii string for a PCD should be in format as ""/''.
- #
+ #
def _PackString(self, ValueString, Size):
if (Size < 0):
EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
@@ -198,7 +198,7 @@ class PcdEntry:
QuotedFlag = False
ValueString = ValueString[1:-1]
- # No null-terminator in 'string'
+ # No null-terminator in 'string'
if (QuotedFlag and len(ValueString) + 1 > Size) or (not QuotedFlag and len(ValueString) > Size):
EdkLogger.error("BPDG", BuildToolError.RESOURCE_OVERFLOW,
"PCD value string %s is exceed to size %d(File: %s Line: %s)" % (ValueString, Size, self.FileName, self.Lineno))
@@ -209,9 +209,9 @@ class PcdEntry:
"Invalid size or value for PCD %s to pack(File: %s Line: %s)." % (self.PcdCName, self.FileName, self.Lineno))
## Pack a byte-array PCD value.
- #
+ #
# A byte-array for a PCD should be in format as {0x01, 0x02, ...}.
- #
+ #
def _PackByteArray(self, ValueString, Size):
if (Size < 0):
EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID, "Invalid parameter Size %s of PCD %s!(File: %s Line: %s)" % (self.PcdBinSize, self.PcdCName, self.FileName, self.Lineno))
@@ -261,7 +261,7 @@ class PcdEntry:
self.PcdValue = ReturnArray.tolist()
## Pack a unicode PCD value into byte array.
- #
+ #
# A unicode string for a PCD should be in format as L""/L''.
#
def _PackUnicode(self, UnicodeString, Size):
@@ -271,7 +271,7 @@ class PcdEntry:
QuotedFlag = True
if UnicodeString.startswith("L'"):
- QuotedFlag = False
+ QuotedFlag = False
UnicodeString = UnicodeString[2:-1]
# No null-terminator in L'string'
@@ -304,7 +304,7 @@ class PcdEntry:
# 2. Format the input file data to remove unused lines;
# 3. Fixed offset if needed;
# 4. Generate output file, including guided.map and guided.bin file;
-#
+#
class GenVPD :
## Constructor of DscBuildData
#
@@ -334,9 +334,9 @@ class GenVPD :
EdkLogger.error("BPDG", BuildToolError.FILE_OPEN_FAILURE, "File open failed for %s" % InputFileName, None)
##
- # Parser the input file which is generated by the build tool. Convert the value of each pcd's
+ # Parser the input file which is generated by the build tool. Convert the value of each pcd's
# from string to it's real format. Also remove the useless line in the input file.
- #
+ #
def ParserInputFile (self):
count = 0
for line in self.FileLinesList:
@@ -390,7 +390,7 @@ class GenVPD :
#
# After remove the useless line, if there are no data remain in the file line list,
# Report warning messages to user's.
- #
+ #
if len(self.FileLinesList) == 0 :
EdkLogger.warn('BPDG', BuildToolError.RESOURCE_NOT_AVAILABLE,
"There are no VPD type pcds defined in DSC file, Please check it.")
@@ -399,7 +399,7 @@ class GenVPD :
count = 0
for line in self.FileLinesList:
if line is not None :
- PCD = PcdEntry(line[0], line[1], line[2], line[3], line[4],line[5], self.InputFileName)
+ PCD = PcdEntry(line[0], line[1], line[2], line[3], line[4],line[5], self.InputFileName)
# Strip the space char
PCD.PcdCName = PCD.PcdCName.strip(' ')
PCD.SkuId = PCD.SkuId.strip(' ')
@@ -480,14 +480,14 @@ class GenVPD :
continue
##
- # This function used to create a clean list only contain useful information and reorganized to make it
+ # This function used to create a clean list only contain useful information and reorganized to make it
# easy to be sorted
#
def FormatFileLine (self) :
for eachPcd in self.FileLinesList :
if eachPcd.PcdOffset != '*' :
- # Use pcd's Offset value as key, and pcd's Value as value
+ # Use pcd's Offset value as key, and pcd's Value as value
self.PcdFixedOffsetSizeList.append(eachPcd)
else :
# Use pcd's CName as key, and pcd's Size as value
@@ -497,11 +497,11 @@ class GenVPD :
##
# This function is use to fix the offset value which the not specified in the map file.
# Usually it use the star (meaning any offset) character in the offset field
- #
+ #
def FixVpdOffset (self):
# At first, the offset should start at 0
# Sort fixed offset list in order to find out where has free spaces for the pcd's offset
- # value is "*" to insert into.
+ # value is "*" to insert into.
self.PcdFixedOffsetSizeList.sort(lambda x, y: cmp(x.PcdBinOffset, y.PcdBinOffset))
@@ -530,57 +530,57 @@ class GenVPD :
Pcd.PcdBinOffset = NowOffset
Pcd.PcdOffset = str(hex(Pcd.PcdBinOffset))
NowOffset += Pcd.PcdOccupySize
-
+
self.PcdFixedOffsetSizeList = self.PcdUnknownOffsetList
return
- # Check the offset of VPD type pcd's offset start from 0.
+ # Check the offset of VPD type pcd's offset start from 0.
if self.PcdFixedOffsetSizeList[0].PcdBinOffset != 0 :
EdkLogger.warn("BPDG", "The offset of VPD type pcd should start with 0, please check it.",
None)
# Judge whether the offset in fixed pcd offset list is overlapped or not.
lenOfList = len(self.PcdFixedOffsetSizeList)
- count = 0
+ count = 0
while (count < lenOfList - 1) :
PcdNow = self.PcdFixedOffsetSizeList[count]
PcdNext = self.PcdFixedOffsetSizeList[count+1]
- # Two pcd's offset is same
+ # Two pcd's offset is same
if PcdNow.PcdBinOffset == PcdNext.PcdBinOffset :
EdkLogger.error("BPDG", BuildToolError.ATTRIBUTE_GET_FAILURE,
"The offset of %s at line: %s is same with %s at line: %s in file %s" % \
(PcdNow.PcdCName, PcdNow.Lineno, PcdNext.PcdCName, PcdNext.Lineno, PcdNext.FileName),
None)
- # Overlapped
+ # Overlapped
if PcdNow.PcdBinOffset + PcdNow.PcdOccupySize > PcdNext.PcdBinOffset :
EdkLogger.error("BPDG", BuildToolError.ATTRIBUTE_GET_FAILURE,
"The offset of %s at line: %s is overlapped with %s at line: %s in file %s" % \
(PcdNow.PcdCName, PcdNow.Lineno, PcdNext.PcdCName, PcdNext.Lineno, PcdNext.FileName),
None)
- # Has free space, raise a warning message
+ # Has free space, raise a warning message
if PcdNow.PcdBinOffset + PcdNow.PcdOccupySize < PcdNext.PcdBinOffset :
EdkLogger.warn("BPDG", BuildToolError.ATTRIBUTE_GET_FAILURE,
"The offsets have free space of between %s at line: %s and %s at line: %s in file %s" % \
(PcdNow.PcdCName, PcdNow.Lineno, PcdNext.PcdCName, PcdNext.Lineno, PcdNext.FileName),
None)
count += 1
-
+
LastOffset = self.PcdFixedOffsetSizeList[0].PcdBinOffset
FixOffsetSizeListCount = 0
lenOfList = len(self.PcdFixedOffsetSizeList)
lenOfUnfixedList = len(self.PcdUnknownOffsetList)
-
+
##
- # Insert the un-fixed offset pcd's list into fixed offset pcd's list if has free space between those pcds.
- #
+ # Insert the un-fixed offset pcd's list into fixed offset pcd's list if has free space between those pcds.
+ #
while (FixOffsetSizeListCount < lenOfList) :
-
- eachFixedPcd = self.PcdFixedOffsetSizeList[FixOffsetSizeListCount]
+
+ eachFixedPcd = self.PcdFixedOffsetSizeList[FixOffsetSizeListCount]
NowOffset = eachFixedPcd.PcdBinOffset
-
- # Has free space
+
+ # Has free space
if LastOffset < NowOffset :
if lenOfUnfixedList != 0 :
countOfUnfixedList = 0
@@ -598,42 +598,42 @@ class GenVPD :
eachUnfixedPcd.PcdBinOffset = LastOffset
# Insert this pcd into fixed offset pcd list.
self.PcdFixedOffsetSizeList.insert(FixOffsetSizeListCount,eachUnfixedPcd)
-
+
# Delete the item's offset that has been fixed and added into fixed offset list
self.PcdUnknownOffsetList.pop(countOfUnfixedList)
-
+
# After item added, should enlarge the length of fixed pcd offset list
- lenOfList += 1
+ lenOfList += 1
FixOffsetSizeListCount += 1
-
+
# Decrease the un-fixed pcd offset list's length
lenOfUnfixedList -= 1
-
- # Modify the last offset value
- LastOffset += needFixPcdSize
+
+ # Modify the last offset value
+ LastOffset += needFixPcdSize
else :
# It can not insert into those two pcds, need to check still has other space can store it.
LastOffset = NowOffset + self.PcdFixedOffsetSizeList[FixOffsetSizeListCount].PcdOccupySize
FixOffsetSizeListCount += 1
break
-
+
# Set the FixOffsetSizeListCount = lenOfList for quit the loop
else :
- FixOffsetSizeListCount = lenOfList
-
- # No free space, smoothly connect with previous pcd.
+ FixOffsetSizeListCount = lenOfList
+
+ # No free space, smoothly connect with previous pcd.
elif LastOffset == NowOffset :
LastOffset = NowOffset + eachFixedPcd.PcdOccupySize
FixOffsetSizeListCount += 1
- # Usually it will not enter into this thunk, if so, means it overlapped.
+ # Usually it will not enter into this thunk, if so, means it overlapped.
else :
EdkLogger.error("BPDG", BuildToolError.ATTRIBUTE_NOT_AVAILABLE,
"The offset value definition has overlapped at pcd: %s, it's offset is: %s, in file: %s line: %s" % \
(eachFixedPcd.PcdCName, eachFixedPcd.PcdOffset, eachFixedPcd.InputFileName, eachFixedPcd.Lineno),
None)
FixOffsetSizeListCount += 1
-
- # Continue to process the un-fixed offset pcd's list, add this time, just append them behind the fixed pcd's offset list.
+
+ # Continue to process the un-fixed offset pcd's list, add this time, just append them behind the fixed pcd's offset list.
lenOfUnfixedList = len(self.PcdUnknownOffsetList)
lenOfList = len(self.PcdFixedOffsetSizeList)
while (lenOfUnfixedList > 0) :
@@ -641,23 +641,23 @@ class GenVPD :
# The last pcd instance
LastPcd = self.PcdFixedOffsetSizeList[lenOfList-1]
NeedFixPcd = self.PcdUnknownOffsetList[0]
-
+
NeedFixPcd.PcdBinOffset = LastPcd.PcdBinOffset + LastPcd.PcdOccupySize
if NeedFixPcd.PcdBinOffset % NeedFixPcd.Alignment != 0:
NeedFixPcd.PcdBinOffset = (NeedFixPcd.PcdBinOffset / NeedFixPcd.Alignment + 1) * NeedFixPcd.Alignment
NeedFixPcd.PcdOffset = str(hex(NeedFixPcd.PcdBinOffset))
-
+
# Insert this pcd into fixed offset pcd list's tail.
self.PcdFixedOffsetSizeList.insert(lenOfList, NeedFixPcd)
# Delete the item's offset that has been fixed and added into fixed offset list
self.PcdUnknownOffsetList.pop(0)
-
+
lenOfList += 1
- lenOfUnfixedList -= 1
+ lenOfUnfixedList -= 1
##
# Write the final data into output files.
- #
+ #
def GenerateVpdFile (self, MapFileName, BinFileName):
#Open an VPD file to process
@@ -705,4 +705,4 @@ class GenVPD :
fStringIO.close ()
fVpdFile.close ()
fMapFile.close ()
-
+
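Aside, not from the patch: _PackIntValue above keys a struct format character off the PCD size. A minimal sketch of that packing; the 1: 'B' entry is taken from the _FORMAT_CHAR table shown above, the 2/4/8 entries ('H', 'L', 'Q') are assumed from the matching FormatMap in ValidCheckingInfoObject.py, and the error reporting and offset bookkeeping of the real class are left out:

    import struct

    SIZE_FORMAT = {1: 'B', 2: 'H', 4: 'L', 8: 'Q'}   # bytes -> struct char

    def pack_int_value(value, size):
        # '=' selects native byte order with standard sizes, as in the
        # "=B"/"=H"/"=L"/"=Q" formats used elsewhere in these files
        if size not in SIZE_FORMAT:
            raise ValueError("unsupported PCD size %d" % size)
        return struct.pack('=' + SIZE_FORMAT[size], value)

    # on a little-endian host: pack_int_value(0x1234, 2) == b'\x34\x12'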
diff --git a/BaseTools/Source/Python/BPDG/StringTable.py b/BaseTools/Source/Python/BPDG/StringTable.py
index bbcb45119868..79acefaf0a94 100644
--- a/BaseTools/Source/Python/BPDG/StringTable.py
+++ b/BaseTools/Source/Python/BPDG/StringTable.py
@@ -31,7 +31,7 @@ MAP_FILE_COMMENT_TEMPLATE = \
# THIS IS AUTO-GENERATED FILE BY BPDG TOOLS AND PLEASE DO NOT MAKE MODIFICATION.
#
# This file lists all VPD informations for a platform fixed/adjusted by BPDG tool.
-#
+#
# Copyright (c) 2010 -2016, Intel Corporation. All rights reserved.<BR>
# This program and the accompanying materials
# are licensed and made available under the terms and conditions of the BSD License
@@ -53,15 +53,15 @@ LBL_BPDG_USAGE = \
Copyright (c) 2010 - 2016, Intel Corporation All Rights Reserved.
Intel(r) Binary Product Data Generation Tool (Intel(r) BPDG)
-
+
Required Flags:
-o BIN_FILENAME, --vpd-filename=BIN_FILENAME
Specify the file name for the VPD binary file
-m FILENAME, --map-filename=FILENAME
- Generate file name for consumption during the build that contains
- the mapping of Pcd name, offset, datum size and value derived
+ Generate file name for consumption during the build that contains
+ the mapping of Pcd name, offset, datum size and value derived
from the input file and any automatic calculations.
-"""
+"""
)
MSG_OPTION_HELP = ("Show this help message and exit.")
diff --git a/BaseTools/Source/Python/Common/BuildVersion.py b/BaseTools/Source/Python/Common/BuildVersion.py
index 7414d30f49ea..6dda750dc687 100644
--- a/BaseTools/Source/Python/Common/BuildVersion.py
+++ b/BaseTools/Source/Python/Common/BuildVersion.py
@@ -4,9 +4,9 @@
#
# Copyright (c) 2011, Intel Corporation. All rights reserved.<BR>
#
-# This program and the accompanying materials are licensed and made available
-# under the terms and conditions of the BSD License which accompanies this
-# distribution. The full text of the license may be found at
+# This program and the accompanying materials are licensed and made available
+# under the terms and conditions of the BSD License which accompanies this
+# distribution. The full text of the license may be found at
# http://opensource.org/licenses/bsd-license.php
#
# THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
diff --git a/BaseTools/Source/Python/Common/Database.py b/BaseTools/Source/Python/Common/Database.py
index a81a44731f03..ca1859a4b912 100644
--- a/BaseTools/Source/Python/Common/Database.py
+++ b/BaseTools/Source/Python/Common/Database.py
@@ -33,7 +33,7 @@ from Table.TableDsc import TableDsc
# This class defined the build databse
# During the phase of initialization, the database will create all tables and
# insert all records of table DataModel
-#
+#
# @param object: Inherited from object class
# @param DbPath: A string for the path of the ECC database
#
@@ -54,7 +54,7 @@ class Database(object):
self.TblInf = TableInf(self.Cur)
self.TblDec = TableDec(self.Cur)
self.TblDsc = TableDsc(self.Cur)
-
+
## Initialize build database
#
# 1. Delete all old existing tables
@@ -69,7 +69,7 @@ class Database(object):
# self.TblDataModel.Drop()
# self.TblDsc.Drop()
# self.TblFile.Drop()
-
+
#
# Create new tables
#
@@ -78,7 +78,7 @@ class Database(object):
self.TblInf.Create()
self.TblDec.Create()
self.TblDsc.Create()
-
+
#
# Initialize table DataModel
#
@@ -91,10 +91,10 @@ class Database(object):
#
def QueryTable(self, Table):
Table.Query()
-
+
## Close entire database
#
- # Commit all first
+ # Commit all first
# Close the connection and cursor
#
def Close(self):
@@ -110,11 +110,10 @@ class Database(object):
if __name__ == '__main__':
EdkLogger.Initialize()
EdkLogger.SetLevel(EdkLogger.DEBUG_0)
-
+
Db = Database(DATABASE_PATH)
Db.InitDatabase()
- Db.QueryTable(Db.TblDataModel)
+ Db.QueryTable(Db.TblDataModel)
Db.QueryTable(Db.TblFile)
Db.QueryTable(Db.TblDsc)
Db.Close()
-
\ No newline at end of file
diff --git a/BaseTools/Source/Python/Common/MigrationUtilities.py b/BaseTools/Source/Python/Common/MigrationUtilities.py
index e9f1cabcb794..27d30a11b529 100644
--- a/BaseTools/Source/Python/Common/MigrationUtilities.py
+++ b/BaseTools/Source/Python/Common/MigrationUtilities.py
@@ -36,10 +36,10 @@ def SetCommon(Common, XmlCommon):
XmlTag = "FeatureFlag"
Common.FeatureFlag = XmlAttribute(XmlCommon, XmlTag)
-
+
XmlTag = "SupArchList"
Common.SupArchList = XmlAttribute(XmlCommon, XmlTag).split()
-
+
XmlTag = XmlNodeName(XmlCommon) + "/" + "HelpText"
Common.HelpText = XmlElement(XmlCommon, XmlTag)
@@ -56,7 +56,7 @@ def SetCommon(Common, XmlCommon):
#
def SetIdentification(CommonHeader, XmlCommonHeader, NameTag, FileName):
XmlParentTag = XmlNodeName(XmlCommonHeader)
-
+
XmlTag = XmlParentTag + "/" + NameTag
CommonHeader.Name = XmlElement(XmlCommonHeader, XmlTag)
@@ -102,7 +102,7 @@ def AddToSpecificationDict(SpecificationDict, SpecificationString):
def SetCommonHeader(CommonHeader, XmlCommonHeader):
"""Set all attributes of CommonHeaderClass object from XmlCommonHeader"""
XmlParent = XmlNodeName(XmlCommonHeader)
-
+
XmlTag = XmlParent + "/" + "Abstract"
CommonHeader.Abstract = XmlElement(XmlCommonHeader, XmlTag)
@@ -144,16 +144,16 @@ def LoadClonedRecord(XmlCloned):
XmlTag = "Cloned/PackageGuid"
ClonedRecord.PackageGuid = XmlElement(XmlCloned, XmlTag)
-
+
XmlTag = "Cloned/PackageVersion"
ClonedRecord.PackageVersion = XmlElement(XmlCloned, XmlTag)
-
+
XmlTag = "Cloned/ModuleGuid"
ClonedRecord.ModuleGuid = XmlElement(XmlCloned, XmlTag)
-
+
XmlTag = "Cloned/ModuleVersion"
ClonedRecord.ModuleVersion = XmlElement(XmlCloned, XmlTag)
-
+
return ClonedRecord
@@ -169,7 +169,7 @@ def LoadClonedRecord(XmlCloned):
#
def LoadGuidProtocolPpiCommon(XmlGuidProtocolPpiCommon):
GuidProtocolPpiCommon = GuidProtocolPpiCommonClass()
-
+
XmlTag = "Name"
GuidProtocolPpiCommon.Name = XmlAttribute(XmlGuidProtocolPpiCommon, XmlTag)
@@ -180,19 +180,19 @@ def LoadGuidProtocolPpiCommon(XmlGuidProtocolPpiCommon):
XmlTag = "%s/GuidCName" % XmlParent
else:
XmlTag = "%s/%sCName" % (XmlParent, XmlParent)
-
+
GuidProtocolPpiCommon.CName = XmlElement(XmlGuidProtocolPpiCommon, XmlTag)
-
+
XmlTag = XmlParent + "/" + "GuidValue"
GuidProtocolPpiCommon.Guid = XmlElement(XmlGuidProtocolPpiCommon, XmlTag)
-
+
if XmlParent.endswith("Notify"):
GuidProtocolPpiCommon.Notify = True
XmlTag = "GuidTypeList"
GuidTypes = XmlAttribute(XmlGuidProtocolPpiCommon, XmlTag)
GuidProtocolPpiCommon.GuidTypeList = GuidTypes.split()
-
+
XmlTag = "SupModuleList"
SupModules = XmlAttribute(XmlGuidProtocolPpiCommon, XmlTag)
GuidProtocolPpiCommon.SupModuleList = SupModules.split()
@@ -264,24 +264,24 @@ def LoadLibraryClass(XmlLibraryClass):
if LibraryClass.LibraryClass == "":
XmlTag = "Name"
LibraryClass.LibraryClass = XmlAttribute(XmlLibraryClass, XmlTag)
-
+
XmlTag = "LibraryClass/IncludeHeader"
LibraryClass.IncludeHeader = XmlElement(XmlLibraryClass, XmlTag)
-
+
XmlTag = "RecommendedInstanceVersion"
RecommendedInstanceVersion = XmlAttribute(XmlLibraryClass, XmlTag)
LibraryClass.RecommendedInstanceVersion = RecommendedInstanceVersion
-
+
XmlTag = "RecommendedInstanceGuid"
RecommendedInstanceGuid = XmlAttribute(XmlLibraryClass, XmlTag)
LibraryClass.RecommendedInstanceGuid = RecommendedInstanceGuid
-
+
XmlTag = "SupModuleList"
SupModules = XmlAttribute(XmlLibraryClass, XmlTag)
LibraryClass.SupModuleList = SupModules.split()
-
+
SetCommon(LibraryClass, XmlLibraryClass)
-
+
return LibraryClass
@@ -297,24 +297,24 @@ def LoadLibraryClass(XmlLibraryClass):
def LoadBuildOption(XmlBuildOption):
"""Return a new BuildOptionClass object equivalent to XmlBuildOption"""
BuildOption = BuildOptionClass()
-
+
BuildOption.Option = XmlElementData(XmlBuildOption)
XmlTag = "BuildTargets"
BuildOption.BuildTargetList = XmlAttribute(XmlBuildOption, XmlTag).split()
-
+
XmlTag = "ToolChainFamily"
BuildOption.ToolChainFamily = XmlAttribute(XmlBuildOption, XmlTag)
-
+
XmlTag = "TagName"
BuildOption.TagName = XmlAttribute(XmlBuildOption, XmlTag)
-
+
XmlTag = "ToolCode"
BuildOption.ToolCode = XmlAttribute(XmlBuildOption, XmlTag)
-
+
XmlTag = "SupArchList"
BuildOption.SupArchList = XmlAttribute(XmlBuildOption, XmlTag).split()
-
+
return BuildOption
@@ -330,15 +330,15 @@ def LoadBuildOption(XmlBuildOption):
#
def LoadUserExtensions(XmlUserExtensions):
UserExtensions = UserExtensionsClass()
-
+
XmlTag = "UserID"
UserExtensions.UserID = XmlAttribute(XmlUserExtensions, XmlTag)
-
+
XmlTag = "Identifier"
UserExtensions.Identifier = XmlAttribute(XmlUserExtensions, XmlTag)
-
+
UserExtensions.Content = XmlElementData(XmlUserExtensions)
-
+
return UserExtensions
@@ -490,7 +490,7 @@ def GetTextFileInfo(FileName, TagTuple):
ValueTuple[Index] = Value
except:
EdkLogger.info("IO Error in reading file %s" % FileName)
-
+
return ValueTuple
@@ -524,7 +524,7 @@ def MigrationOptionParser(Source, Destinate, ToolName, VersionNumber=1.0):
UsageString = "%s [-a] [-v|-q] [-o <output_file>] <input_file>" % ToolName
Version = "%s Version %.2f" % (ToolName, VersionNumber)
Copyright = "Copyright (c) 2007, Intel Corporation. All rights reserved."
-
+
Parser = OptionParser(description=Copyright, version=Version, usage=UsageString)
Parser.add_option("-o", "--output", dest="OutputFile", help="The name of the %s file to be created." % Destinate)
Parser.add_option("-a", "--auto", dest="AutoWrite", action="store_true", default=False, help="Automatically create the %s file using the name of the %s file and replacing file extension" % (Source, Destinate))
@@ -540,7 +540,7 @@ def MigrationOptionParser(Source, Destinate, ToolName, VersionNumber=1.0):
EdkLogger.setLevel(EdkLogger.QUIET)
else:
EdkLogger.setLevel(EdkLogger.INFO)
-
+
# error check
if len(Args) == 0:
raise MigrationError(PARAMETER_MISSING, name="Input file", usage=Parser.get_usage())
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index f05ae39ebb29..f6ebaa60e23f 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -56,11 +56,11 @@ gFileTimeStampCache = {} # {file path : file time stamp}
gDependencyDatabase = {} # arch : {file path : [dependent files list]}
def GetVariableOffset(mapfilepath, efifilepath, varnames):
- """ Parse map file to get variable offset in current EFI file
+ """ Parse map file to get variable offset in current EFI file
@param mapfilepath Map file absolution path
@param efifilepath: EFI binary file full path
@param varnames iteratable container whose elements are variable names to be searched
-
+
@return List whos elements are tuple with variable name and raw offset
"""
lines = []
@@ -70,7 +70,7 @@ def GetVariableOffset(mapfilepath, efifilepath, varnames):
f.close()
except:
return None
-
+
if len(lines) == 0: return None
firstline = lines[0].strip()
if (firstline.startswith("Archive member included ") and
@@ -170,7 +170,7 @@ def _parseGeneral(lines, efifilepath, varnames):
continue
if line.startswith("entry point at"):
status = 3
- continue
+ continue
if status == 1 and len(line) != 0:
m = secReGeneral.match(line)
assert m is not None, "Fail to parse the section in map file , line is %s" % line
@@ -250,7 +250,7 @@ def ProcessDuplicatedInf(Path, BaseName, Workspace):
#
# A temporary INF is copied to database path which must have write permission
# The temporary will be removed at the end of build
- # In case of name conflict, the file name is
+ # In case of name conflict, the file name is
# FILE_GUIDBaseName (0D1B936F-68F3-4589-AFCC-FB8B7AEBC836module.inf)
#
TempFullPath = os.path.join(DbDir,
@@ -261,7 +261,7 @@ def ProcessDuplicatedInf(Path, BaseName, Workspace):
#
# To build same module more than once, the module path with FILE_GUID overridden has
# the file name FILE_GUIDmodule.inf, but the relative path (self.MetaFile.File) is the real path
- # in DSC which is used as relative path by C files and other files in INF.
+ # in DSC which is used as relative path by C files and other files in INF.
# A trick was used: all module paths are PathClass instances, after the initialization
# of PathClass, the PathClass.Path is overridden by the temporary INF path.
#
@@ -1538,29 +1538,29 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
# Used to avoid split issue while the value string contain "|" character
#
# @param[in] Setting: A String contain value/datum type/token number information;
-#
-# @retval ValueList: A List contain value, datum type and toke number.
+#
+# @retval ValueList: A List contain value, datum type and toke number.
#
def AnalyzePcdData(Setting):
ValueList = ['', '', '']
ValueRe = re.compile(r'^\s*L?\".*\|.*\"')
PtrValue = ValueRe.findall(Setting)
-
+
ValueUpdateFlag = False
-
+
if len(PtrValue) >= 1:
Setting = re.sub(ValueRe, '', Setting)
ValueUpdateFlag = True
TokenList = Setting.split(TAB_VALUE_SPLIT)
ValueList[0:len(TokenList)] = TokenList
-
+
if ValueUpdateFlag:
ValueList[0] = PtrValue[0]
-
- return ValueList
-
+
+ return ValueList
+
## check format of PCD value against its the datum type
#
# For PCD value setting
@@ -1764,7 +1764,7 @@ class PathClass(object):
OtherKey = Other.Path
else:
OtherKey = str(Other)
-
+
SelfKey = self.Path
if SelfKey == OtherKey:
return 0
@@ -1902,7 +1902,7 @@ class PeImageClass():
def _ByteListToStr(self, ByteList):
String = ''
for index in range(len(ByteList)):
- if ByteList[index] == 0:
+ if ByteList[index] == 0:
break
String += chr(ByteList[index])
return String
@@ -1939,11 +1939,11 @@ class DefaultStore():
if sid == minid:
return name
class SkuClass():
-
+
DEFAULT = 0
SINGLE = 1
MULTIPLE =2
-
+
def __init__(self,SkuIdentifier='', SkuIds=None):
if SkuIds is None:
SkuIds = {}
@@ -1955,7 +1955,7 @@ class SkuClass():
EdkLogger.error("build", PARAMETER_INVALID,
ExtraData = "SKU-ID [%s] value %s exceeds the max value of UINT64"
% (SkuName, SkuId))
-
+
self.AvailableSkuIds = sdict()
self.SkuIdSet = []
self.SkuIdNumberSet = []
@@ -1969,10 +1969,10 @@ class SkuClass():
self.SkuIdSet = SkuIds.keys()
self.SkuIdNumberSet = [num[0].strip() + 'U' for num in SkuIds.values()]
else:
- r = SkuIdentifier.split('|')
+ r = SkuIdentifier.split('|')
self.SkuIdSet=[(r[k].strip()).upper() for k in range(len(r))]
k = None
- try:
+ try:
self.SkuIdNumberSet = [SkuIds[k][0].strip() + 'U' for k in self.SkuIdSet]
except Exception:
EdkLogger.error("build", PARAMETER_INVALID,
@@ -2021,7 +2021,7 @@ class SkuClass():
skuorderset = []
for skuname in self.SkuIdSet:
skuorderset.append(self.GetSkuChain(skuname))
-
+
skuorder = []
for index in range(max([len(item) for item in skuorderset])):
for subset in skuorderset:
@@ -2033,8 +2033,8 @@ class SkuClass():
return skuorder
- def __SkuUsageType(self):
-
+ def __SkuUsageType(self):
+
if self.__SkuIdentifier.upper() == "ALL":
return SkuClass.MULTIPLE
@@ -2067,7 +2067,7 @@ class SkuClass():
return ArrayStr
def __GetAvailableSkuIds(self):
return self.AvailableSkuIds
-
+
def __GetSystemSkuID(self):
if self.__SkuUsageType() == SkuClass.SINGLE:
if len(self.SkuIdSet) == 1:
diff --git a/BaseTools/Source/Python/Common/MultipleWorkspace.py b/BaseTools/Source/Python/Common/MultipleWorkspace.py
index 2a76d49cc627..a80f22ade7da 100644
--- a/BaseTools/Source/Python/Common/MultipleWorkspace.py
+++ b/BaseTools/Source/Python/Common/MultipleWorkspace.py
@@ -20,16 +20,16 @@ from Common.DataType import TAB_WORKSPACE
## MultipleWorkspace
#
# This class manage multiple workspace behavior
-#
+#
# @param class:
#
# @var WORKSPACE: defined the current WORKSPACE
# @var PACKAGES_PATH: defined the other WORKSAPCE, if current WORKSPACE is invalid, search valid WORKSPACE from PACKAGES_PATH
-#
+#
class MultipleWorkspace(object):
WORKSPACE = ''
PACKAGES_PATH = None
-
+
## convertPackagePath()
#
# Convert path to match workspace.
@@ -59,7 +59,7 @@ class MultipleWorkspace(object):
cls.PACKAGES_PATH = [cls.convertPackagePath (Ws, os.path.normpath(Path.strip())) for Path in PackagesPath.split(os.pathsep)]
else:
cls.PACKAGES_PATH = []
-
+
## join()
#
# rewrite os.path.join function
@@ -79,7 +79,7 @@ class MultipleWorkspace(object):
return Path
Path = os.path.join(Ws, *p)
return Path
-
+
## relpath()
#
# rewrite os.path.relpath function
@@ -98,7 +98,7 @@ class MultipleWorkspace(object):
if Path.lower().startswith(Ws.lower()):
Path = os.path.relpath(Path, Ws)
return Path
-
+
## getWs()
#
# get valid workspace for the path
@@ -117,7 +117,7 @@ class MultipleWorkspace(object):
if os.path.exists(absPath):
return Pkg
return Ws
-
+
## handleWsMacro()
#
# handle the $(WORKSPACE) tag, if current workspace is invalid path relative the tool, replace it.
@@ -143,7 +143,7 @@ class MultipleWorkspace(object):
PathList[i] = str[0:MacroStartPos] + Path
PathStr = ' '.join(PathList)
return PathStr
-
+
## getPkgPath()
#
# get all package pathes.
@@ -153,4 +153,3 @@ class MultipleWorkspace(object):
@classmethod
def getPkgPath(cls):
return cls.PACKAGES_PATH
-
\ No newline at end of file
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index 4d07bd752330..35b35e4893bc 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -42,7 +42,7 @@ ERR_IN_OPERAND = 'Macro after IN operator can only be: $(FAMILY), $(ARCH), $(TOO
class RangeObject(object):
def __init__(self, start, end, empty = False):
-
+
if int(start) < int(end):
self.start = int(start)
self.end = int(end)
@@ -54,24 +54,24 @@ class RangeObject(object):
class RangeContainer(object):
def __init__(self):
self.rangelist = []
-
+
def push(self, RangeObject):
self.rangelist.append(RangeObject)
self.rangelist = sorted(self.rangelist, key = lambda rangeobj : rangeobj.start)
self.merge()
-
+
def pop(self):
for item in self.rangelist:
yield item
-
- def __clean__(self):
+
+ def __clean__(self):
newrangelist = []
for rangeobj in self.rangelist:
if rangeobj.empty == True:
continue
else:
newrangelist.append(rangeobj)
- self.rangelist = newrangelist
+ self.rangelist = newrangelist
def merge(self):
self.__clean__()
for i in range(0, len(self.rangelist) - 1):
@@ -79,23 +79,23 @@ class RangeContainer(object):
continue
else:
self.rangelist[i + 1].start = self.rangelist[i].start
- self.rangelist[i + 1].end = self.rangelist[i + 1].end > self.rangelist[i].end and self.rangelist[i + 1].end or self.rangelist[i].end
+ self.rangelist[i + 1].end = self.rangelist[i + 1].end > self.rangelist[i].end and self.rangelist[i + 1].end or self.rangelist[i].end
self.rangelist[i].empty = True
self.__clean__()
-
+
def dump(self):
print "----------------------"
rangelist = ""
for object in self.rangelist:
rangelist = rangelist + "[%d , %d]" % (object.start, object.end)
print rangelist
-
-
-class XOROperatorObject(object):
- def __init__(self):
+
+
+class XOROperatorObject(object):
+ def __init__(self):
pass
- def Calculate(self, Operand, DataType, SymbolTable):
+ def Calculate(self, Operand, DataType, SymbolTable):
if type(Operand) == type('') and not Operand.isalnum():
Expr = "XOR ..."
raise BadExpression(ERR_SNYTAX % Expr)
@@ -107,9 +107,9 @@ class XOROperatorObject(object):
return rangeId
class LEOperatorObject(object):
- def __init__(self):
+ def __init__(self):
pass
- def Calculate(self, Operand, DataType, SymbolTable):
+ def Calculate(self, Operand, DataType, SymbolTable):
if type(Operand) == type('') and not Operand.isalnum():
Expr = "LE ..."
raise BadExpression(ERR_SNYTAX % Expr)
@@ -119,22 +119,22 @@ class LEOperatorObject(object):
SymbolTable[rangeId1] = rangeContainer
return rangeId1
class LTOperatorObject(object):
- def __init__(self):
+ def __init__(self):
pass
def Calculate(self, Operand, DataType, SymbolTable):
if type(Operand) == type('') and not Operand.isalnum():
- Expr = "LT ..."
- raise BadExpression(ERR_SNYTAX % Expr)
+ Expr = "LT ..."
+ raise BadExpression(ERR_SNYTAX % Expr)
rangeId1 = str(uuid.uuid1())
rangeContainer = RangeContainer()
rangeContainer.push(RangeObject(0, int(Operand) - 1))
SymbolTable[rangeId1] = rangeContainer
- return rangeId1
+ return rangeId1
class GEOperatorObject(object):
- def __init__(self):
+ def __init__(self):
pass
- def Calculate(self, Operand, DataType, SymbolTable):
+ def Calculate(self, Operand, DataType, SymbolTable):
if type(Operand) == type('') and not Operand.isalnum():
Expr = "GE ..."
raise BadExpression(ERR_SNYTAX % Expr)
@@ -142,12 +142,12 @@ class GEOperatorObject(object):
rangeContainer = RangeContainer()
rangeContainer.push(RangeObject(int(Operand), MAX_VAL_TYPE[DataType]))
SymbolTable[rangeId1] = rangeContainer
- return rangeId1
-
+ return rangeId1
+
class GTOperatorObject(object):
- def __init__(self):
+ def __init__(self):
pass
- def Calculate(self, Operand, DataType, SymbolTable):
+ def Calculate(self, Operand, DataType, SymbolTable):
if type(Operand) == type('') and not Operand.isalnum():
Expr = "GT ..."
raise BadExpression(ERR_SNYTAX % Expr)
@@ -155,12 +155,12 @@ class GTOperatorObject(object):
rangeContainer = RangeContainer()
rangeContainer.push(RangeObject(int(Operand) + 1, MAX_VAL_TYPE[DataType]))
SymbolTable[rangeId1] = rangeContainer
- return rangeId1
-
+ return rangeId1
+
class EQOperatorObject(object):
- def __init__(self):
+ def __init__(self):
pass
- def Calculate(self, Operand, DataType, SymbolTable):
+ def Calculate(self, Operand, DataType, SymbolTable):
if type(Operand) == type('') and not Operand.isalnum():
Expr = "EQ ..."
raise BadExpression(ERR_SNYTAX % Expr)
@@ -168,8 +168,8 @@ class EQOperatorObject(object):
rangeContainer = RangeContainer()
rangeContainer.push(RangeObject(int(Operand) , int(Operand)))
SymbolTable[rangeId1] = rangeContainer
- return rangeId1
-
+ return rangeId1
+
def GetOperatorObject(Operator):
if Operator == '>':
return GTOperatorObject()
@@ -213,8 +213,8 @@ class RangeExpression(object):
NumberDict[HexNumber] = Number
for HexNum in NumberDict:
expr = expr.replace(HexNum, NumberDict[HexNum])
-
- rangedict = {}
+
+ rangedict = {}
for validrange in self.RangePattern.findall(expr):
start, end = validrange.split(" - ")
start = start.strip()
@@ -224,19 +224,19 @@ class RangeExpression(object):
rangeContainer.push(RangeObject(start, end))
self.operanddict[str(rangeid)] = rangeContainer
rangedict[validrange] = str(rangeid)
-
+
for validrange in rangedict:
expr = expr.replace(validrange, rangedict[validrange])
-
- self._Expr = expr
+
+ self._Expr = expr
return expr
-
-
+
+
def EvalRange(self, Operator, Oprand):
operatorobj = GetOperatorObject(Operator)
return operatorobj.Calculate(Oprand, self.PcdDataType, self.operanddict)
-
+
def Rangeintersection(self, Oprand1, Oprand2):
rangeContainer1 = self.operanddict[Oprand1]
rangeContainer2 = self.operanddict[Oprand2]
@@ -265,35 +265,35 @@ class RangeExpression(object):
elif end1 >= end2:
rangeid = str(uuid.uuid1())
rangeContainer.push(RangeObject(start2, end2))
-
+
self.operanddict[rangeid] = rangeContainer
# rangeContainer.dump()
return rangeid
-
+
def Rangecollections(self, Oprand1, Oprand2):
rangeContainer1 = self.operanddict[Oprand1]
rangeContainer2 = self.operanddict[Oprand2]
rangeContainer = RangeContainer()
-
+
for rangeobj in rangeContainer2.pop():
rangeContainer.push(rangeobj)
for rangeobj in rangeContainer1.pop():
rangeContainer.push(rangeobj)
-
+
rangeid = str(uuid.uuid1())
self.operanddict[rangeid] = rangeContainer
-
+
# rangeContainer.dump()
return rangeid
-
-
+
+
def NegtiveRange(self, Oprand1):
rangeContainer1 = self.operanddict[Oprand1]
-
-
+
+
rangeids = []
-
+
for rangeobj in rangeContainer1.pop():
rangeContainer = RangeContainer()
rangeid = str(uuid.uuid1())
@@ -320,13 +320,13 @@ class RangeExpression(object):
re = self.Rangeintersection(rangeids[0], rangeids[1])
for i in range(2, len(rangeids)):
re = self.Rangeintersection(re, rangeids[i])
-
+
rangeid2 = str(uuid.uuid1())
self.operanddict[rangeid2] = self.operanddict[re]
return rangeid2
-
+
def Eval(self, Operator, Oprand1, Oprand2 = None):
-
+
if Operator in ["!", "NOT", "not"]:
if not gGuidPattern.match(Oprand1.strip()):
raise BadExpression(ERR_STRING_EXPR % Operator)
@@ -337,7 +337,7 @@ class RangeExpression(object):
elif Operator == 'and' :
if not gGuidPatternEnd.match(Oprand1.strip()) or not gGuidPatternEnd.match(Oprand2.strip()):
raise BadExpression(ERR_STRING_EXPR % Operator)
- return self.Rangeintersection(Oprand1, Oprand2)
+ return self.Rangeintersection(Oprand1, Oprand2)
elif Operator == 'or':
if not gGuidPatternEnd.match(Oprand1.strip()) or not gGuidPatternEnd.match(Oprand2.strip()):
raise BadExpression(ERR_STRING_EXPR % Operator)
@@ -367,11 +367,11 @@ class RangeExpression(object):
self._Len = len(self._Expr)
self._Token = ''
self._WarnExcept = None
-
+
# Literal token without any conversion
self._LiteralToken = ''
-
+
# store the operand object
self.operanddict = {}
# The Pcd max value depends on PcdDataType
@@ -391,9 +391,9 @@ class RangeExpression(object):
self._Depth = Depth
self._Expr = self._Expr.strip()
-
+
self.preProcessRangeExpr(self._Expr)
-
+
# check if the expression does not need to evaluate
if RealValue and Depth == 0:
self._Token = self._Expr
@@ -405,12 +405,12 @@ class RangeExpression(object):
Val = self._OrExpr()
RealVal = Val
-
+
RangeIdList = RealVal.split("or")
RangeList = []
for rangeid in RangeIdList:
RangeList.append(self.operanddict[rangeid.strip()])
-
+
return RangeList
# Template function to parse binary operators which have same precedence
@@ -706,10 +706,10 @@ class RangeExpression(object):
return False
-
-
-
-
+
+
+
+
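Aside, not from the patch: RangeContainer above keeps its RangeObjects sorted by start and folds overlapping neighbours together. A stripped-down, hypothetical version of that merge over plain (start, end) pairs; it also joins ranges that merely touch, which may differ from the class's exact rule:

    def merge_ranges(ranges):
        # ranges: iterable of (start, end) pairs with inclusive bounds
        merged = []
        for start, end in sorted(ranges):
            if merged and start <= merged[-1][1] + 1:
                # overlaps or touches the previous range: extend it
                merged[-1] = (merged[-1][0], max(merged[-1][1], end))
            else:
                merged.append((start, end))
        return merged

    # e.g. merge_ranges([(5, 9), (0, 3), (8, 12)]) -> [(0, 3), (5, 12)]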
diff --git a/BaseTools/Source/Python/Common/String.py b/BaseTools/Source/Python/Common/String.py
index ee26d7f7b1b0..389a3ca51d27 100644
--- a/BaseTools/Source/Python/Common/String.py
+++ b/BaseTools/Source/Python/Common/String.py
@@ -839,7 +839,7 @@ def StringToArray(String):
return "{%s,0x00}" % ",".join([ C.strip() for C in String[1:-1].split(',')])
else:
return "{%s}" % ",".join([ C.strip() for C in String[1:-1].split(',')])
-
+
else:
if len(String.split()) % 2:
return '{%s,0}' % ','.join(String.split())
diff --git a/BaseTools/Source/Python/Common/ToolDefClassObject.py b/BaseTools/Source/Python/Common/ToolDefClassObject.py
index 49b24ef780c7..83359586b994 100644
--- a/BaseTools/Source/Python/Common/ToolDefClassObject.py
+++ b/BaseTools/Source/Python/Common/ToolDefClassObject.py
@@ -92,7 +92,7 @@ class ToolDefClassObject(object):
KeyList = [TAB_TOD_DEFINES_TARGET, TAB_TOD_DEFINES_TOOL_CHAIN_TAG, TAB_TOD_DEFINES_TARGET_ARCH, TAB_TOD_DEFINES_COMMAND_TYPE]
for Index in range(3, -1, -1):
- # make a copy of the keys to enumerate over to prevent issues when
+ # make a copy of the keys to enumerate over to prevent issues when
# adding/removing items from the original dict.
for Key in list(self.ToolsDefTxtDictionary.keys()):
List = Key.split('_')
diff --git a/BaseTools/Source/Python/Common/VariableAttributes.py b/BaseTools/Source/Python/Common/VariableAttributes.py
index a2e22ca0409c..24d6f066fa3b 100644
--- a/BaseTools/Source/Python/Common/VariableAttributes.py
+++ b/BaseTools/Source/Python/Common/VariableAttributes.py
@@ -1,5 +1,5 @@
# # @file
-#
+#
# This file is used to handle the variable attributes and property information
#
#
@@ -12,7 +12,7 @@
# THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
-
+
class VariableAttributes(object):
EFI_VARIABLE_NON_VOLATILE = 0x00000001
EFI_VARIABLE_BOOTSERVICE_ACCESS = 0x00000002
@@ -24,22 +24,22 @@ class VariableAttributes(object):
"RT":EFI_VARIABLE_RUNTIME_ACCESS,
"RO":VAR_CHECK_VARIABLE_PROPERTY_READ_ONLY
}
-
+
def __init__(self):
pass
-
+
@staticmethod
def GetVarAttributes(var_attr_str):
VarAttr = 0x00000000
VarProp = 0x00000000
-
+
attr_list = var_attr_str.split(",")
for attr in attr_list:
attr = attr.strip()
if attr == 'RO':
VarProp = VariableAttributes.VAR_CHECK_VARIABLE_PROPERTY_READ_ONLY
else:
- VarAttr = VarAttr | VariableAttributes.VarAttributesMap.get(attr, 0x00000000)
+ VarAttr = VarAttr | VariableAttributes.VarAttributesMap.get(attr, 0x00000000)
return VarAttr, VarProp
@staticmethod
def ValidateVarAttributes(var_attr_str):
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index 32895deb5d0c..c9fdbff20e0b 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -1,9 +1,9 @@
## @file
-#
+#
# This package manage the VPD PCD information file which will be generated
# by build tool's autogen.
# The VPD PCD information file will be input for third-party BPDG tool which
-# is pointed by *_*_*_VPD_TOOL_GUID in conf/tools_def.txt
+# is pointed by *_*_*_VPD_TOOL_GUID in conf/tools_def.txt
#
#
# Copyright (c) 2010 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -32,7 +32,7 @@ FILE_COMMENT_TEMPLATE = \
# THIS IS AUTO-GENERATED FILE BY BUILD TOOLS AND PLEASE DO NOT MAKE MODIFICATION.
#
# This file lists all VPD informations for a platform collected by build.exe.
-#
+#
# Copyright (c) 2010, Intel Corporation. All rights reserved.<BR>
# This program and the accompanying materials
# are licensed and made available under the terms and conditions of the BSD License
@@ -69,17 +69,17 @@ FILE_COMMENT_TEMPLATE = \
#
class VpdInfoFile:
- _rVpdPcdLine = None
+ _rVpdPcdLine = None
## Constructor
def __init__(self):
## Dictionary for VPD in following format
#
- # Key : PcdClassObject instance.
+ # Key : PcdClassObject instance.
# @see BuildClassObject.PcdClassObject
# Value : offset in different SKU such as [sku1_offset, sku2_offset]
self._VpdArray = {}
self._VpdInfo = {}
-
+
## Add a VPD PCD collected from platform's autogen when building.
#
# @param vpds The list of VPD PCD collected for a platform.
@@ -90,40 +90,40 @@ class VpdInfoFile:
def Add(self, Vpd, skuname,Offset):
if (Vpd is None):
EdkLogger.error("VpdInfoFile", BuildToolError.ATTRIBUTE_UNKNOWN_ERROR, "Invalid VPD PCD entry.")
-
+
if not (Offset >= 0 or Offset == "*"):
EdkLogger.error("VpdInfoFile", BuildToolError.PARAMETER_INVALID, "Invalid offset parameter: %s." % Offset)
-
+
if Vpd.DatumType == TAB_VOID:
if Vpd.MaxDatumSize <= 0:
- EdkLogger.error("VpdInfoFile", BuildToolError.PARAMETER_INVALID,
+ EdkLogger.error("VpdInfoFile", BuildToolError.PARAMETER_INVALID,
"Invalid max datum size for VPD PCD %s.%s" % (Vpd.TokenSpaceGuidCName, Vpd.TokenCName))
- elif Vpd.DatumType in TAB_PCD_NUMERIC_TYPES:
+ elif Vpd.DatumType in TAB_PCD_NUMERIC_TYPES:
if not Vpd.MaxDatumSize:
Vpd.MaxDatumSize = MAX_SIZE_TYPE[Vpd.DatumType]
else:
if Vpd.MaxDatumSize <= 0:
EdkLogger.error("VpdInfoFile", BuildToolError.PARAMETER_INVALID,
"Invalid max datum size for VPD PCD %s.%s" % (Vpd.TokenSpaceGuidCName, Vpd.TokenCName))
-
+
if Vpd not in self._VpdArray:
#
- # If there is no Vpd instance in dict, that imply this offset for a given SKU is a new one
+ # If there is no Vpd instance in dict, that imply this offset for a given SKU is a new one
#
self._VpdArray[Vpd] = {}
self._VpdArray[Vpd].update({skuname:Offset})
-
-
+
+
## Generate VPD PCD information into a text file
- #
+ #
# If parameter FilePath is invalid, then assert.
- # If
+ # If
# @param FilePath The given file path which would hold VPD information
def Write(self, FilePath):
if not (FilePath is not None or len(FilePath) != 0):
- EdkLogger.error("VpdInfoFile", BuildToolError.PARAMETER_INVALID,
- "Invalid parameter FilePath: %s." % FilePath)
+ EdkLogger.error("VpdInfoFile", BuildToolError.PARAMETER_INVALID,
+ "Invalid parameter FilePath: %s." % FilePath)
Content = FILE_COMMENT_TEMPLATE
Pcds = self._VpdArray.keys()
@@ -155,15 +155,15 @@ class VpdInfoFile:
try:
fd = open(FilePath, "r")
except:
- EdkLogger.error("VpdInfoFile",
- BuildToolError.FILE_OPEN_FAILURE,
+ EdkLogger.error("VpdInfoFile",
+ BuildToolError.FILE_OPEN_FAILURE,
"Fail to open file %s for written." % FilePath)
Lines = fd.readlines()
for Line in Lines:
Line = Line.strip()
if len(Line) == 0 or Line.startswith("#"):
continue
-
+
#
# the line must follow output format defined in BPDG spec.
#
@@ -173,9 +173,9 @@ class VpdInfoFile:
TokenSpaceName, PcdTokenName = PcdName.split(".")
except:
EdkLogger.error("BPDG", BuildToolError.PARSER_ERROR, "Fail to parse VPD information file %s" % FilePath)
-
+
Found = False
-
+
if (TokenSpaceName, PcdTokenName) not in self._VpdInfo:
self._VpdInfo[(TokenSpaceName, PcdTokenName)] = []
self._VpdInfo[(TokenSpaceName, PcdTokenName)].append((SkuId,Offset, Value))
@@ -188,61 +188,61 @@ class VpdInfoFile:
if VpdObject.TokenSpaceGuidCName == TokenSpaceName and VpdObjectTokenCName == PcdTokenName.strip() and sku == SkuId:
if self._VpdArray[VpdObject][sku] == "*":
if Offset == "*":
- EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID, "The offset of %s has not been fixed up by third-party BPDG tool." % PcdName)
+ EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID, "The offset of %s has not been fixed up by third-party BPDG tool." % PcdName)
self._VpdArray[VpdObject][sku] = Offset
Found = True
if not Found:
EdkLogger.error("BPDG", BuildToolError.PARSER_ERROR, "Can not find PCD defined in VPD guid file.")
-
+
## Get count of VPD PCD collected from platform's autogen when building.
#
- # @return The integer count value
+ # @return The integer count value
def GetCount(self):
Count = 0
for OffsetList in self._VpdArray.values():
Count += len(OffsetList)
-
+
return Count
-
+
## Get an offset value for a given VPD PCD
#
- # Because BPDG only support one Sku, so only return offset for SKU default.
+ # Because BPDG only support one Sku, so only return offset for SKU default.
#
- # @param vpd A given VPD PCD
+ # @param vpd A given VPD PCD
def GetOffset(self, vpd):
if not self._VpdArray.has_key(vpd):
return None
-
+
if len(self._VpdArray[vpd]) == 0:
return None
-
+
return self._VpdArray[vpd]
def GetVpdInfo(self,(PcdTokenName,TokenSpaceName)):
return self._VpdInfo.get((TokenSpaceName, PcdTokenName))
-
+
## Call external BPDG tool to process VPD file
-#
+#
# @param ToolPath The string path name for BPDG tool
# @param VpdFileName The string path name for VPD information guid.txt
-#
+#
def CallExtenalBPDGTool(ToolPath, VpdFileName):
assert ToolPath is not None, "Invalid parameter ToolPath"
assert VpdFileName is not None and os.path.exists(VpdFileName), "Invalid parameter VpdFileName"
-
+
OutputDir = os.path.dirname(VpdFileName)
FileName = os.path.basename(VpdFileName)
BaseName, ext = os.path.splitext(FileName)
OutputMapFileName = os.path.join(OutputDir, "%s.map" % BaseName)
OutputBinFileName = os.path.join(OutputDir, "%s.bin" % BaseName)
-
+
try:
PopenObject = subprocess.Popen(' '.join([ToolPath,
- '-o', OutputBinFileName,
+ '-o', OutputBinFileName,
'-m', OutputMapFileName,
'-q',
'-f',
VpdFileName]),
- stdout=subprocess.PIPE,
+ stdout=subprocess.PIPE,
stderr= subprocess.PIPE,
shell=True)
except Exception, X:
@@ -251,11 +251,11 @@ def CallExtenalBPDGTool(ToolPath, VpdFileName):
print out
while PopenObject.returncode is None :
PopenObject.wait()
-
+
if PopenObject.returncode != 0:
if PopenObject.returncode != 0:
EdkLogger.debug(EdkLogger.DEBUG_1, "Fail to call BPDG tool", str(error))
EdkLogger.error("BPDG", BuildToolError.COMMAND_FAILURE, "Fail to execute BPDG tool with exit code: %d, the error message is: \n %s" % \
(PopenObject.returncode, str(error)))
-
+
return PopenObject.returncode
diff --git a/BaseTools/Source/Python/CommonDataClass/FdfClass.py b/BaseTools/Source/Python/CommonDataClass/FdfClass.py
index 96a630f4d2cc..563a7c9ddbd9 100644
--- a/BaseTools/Source/Python/CommonDataClass/FdfClass.py
+++ b/BaseTools/Source/Python/CommonDataClass/FdfClass.py
@@ -83,7 +83,7 @@ class RegionClassObject:
## FFS data in FDF
#
-#
+#
class FfsClassObject:
## The constructor
#
@@ -98,7 +98,7 @@ class FfsClassObject:
## FILE statement data in FDF
#
-#
+#
class FileStatementClassObject (FfsClassObject) :
## The constructor
#
@@ -149,7 +149,7 @@ class AprioriSectionClassObject:
## section data in FDF
#
-#
+#
class SectionClassObject:
## The constructor
#
@@ -157,10 +157,10 @@ class SectionClassObject:
#
def __init__(self):
self.Alignment = None
-
+
## Depex expression section in FDF
#
-#
+#
class DepexSectionClassObject (SectionClassObject):
## The constructor
#
@@ -186,7 +186,7 @@ class CompressSectionClassObject (SectionClassObject) :
## Data section data in FDF
#
-#
+#
class DataSectionClassObject (SectionClassObject):
## The constructor
#
@@ -220,7 +220,7 @@ class EfiSectionClassObject (SectionClassObject):
## FV image section data in FDF
#
-#
+#
class FvImageSectionClassObject (SectionClassObject):
## The constructor
#
@@ -237,7 +237,7 @@ class FvImageSectionClassObject (SectionClassObject):
## GUIDed section data in FDF
#
-#
+#
class GuidSectionClassObject (SectionClassObject) :
## The constructor
#
@@ -270,7 +270,7 @@ class UiSectionClassObject (SectionClassObject):
## Version section data in FDF
#
-#
+#
class VerSectionClassObject (SectionClassObject):
## The constructor
#
@@ -305,7 +305,7 @@ class RuleClassObject :
## Complex rule data in FDF
#
-#
+#
class RuleComplexFileClassObject(RuleClassObject) :
## The constructor
#
@@ -343,7 +343,7 @@ class RuleFileExtensionClassObject(RuleClassObject):
## Capsule data in FDF
#
-#
+#
class CapsuleClassObject :
## The constructor
#
@@ -380,7 +380,7 @@ class VtfClassObject :
## VTF component data in FDF
#
-#
+#
class ComponentStatementClassObject :
## The constructor
#
@@ -396,7 +396,7 @@ class ComponentStatementClassObject :
self.CompSym = None
self.CompSize = None
self.FilePos = None
-
+
## OptionROM data in FDF
#
#
@@ -408,4 +408,4 @@ class OptionRomClassObject:
def __init__(self):
self.DriverName = None
self.FfsList = []
-
+
diff --git a/BaseTools/Source/Python/Ecc/CLexer.py b/BaseTools/Source/Python/Ecc/CLexer.py
index a496f4344030..c7956e8ddae6 100644
--- a/BaseTools/Source/Python/Ecc/CLexer.py
+++ b/BaseTools/Source/Python/Ecc/CLexer.py
@@ -2,7 +2,7 @@
from antlr3 import *
from antlr3.compat import set, frozenset
-
+
## @file
# The file defines the Lexer for C source files.
#
@@ -4341,7 +4341,7 @@ class CLexer(Lexer):
u"\12\uffff"
)
-
+
DFA25_transition = [
DFA.unpack(u"\1\2\1\uffff\12\1"),
DFA.unpack(u"\1\3\1\uffff\12\1\12\uffff\1\5\1\4\1\5\35\uffff\1\5"
@@ -4479,7 +4479,7 @@ class CLexer(Lexer):
u"\u0192\uffff"
)
-
+
DFA35_transition = [
DFA.unpack(u"\6\73\2\70\1\73\2\70\22\73\1\70\1\50\1\65\1\72\1\63"
u"\1\45\1\46\1\64\1\34\1\35\1\40\1\42\1\3\1\43\1\41\1\44\1\66\11"
@@ -4943,5 +4943,5 @@ class CLexer(Lexer):
# class definition for DFA #35
DFA35 = DFA
-
+
diff --git a/BaseTools/Source/Python/Ecc/CParser.py b/BaseTools/Source/Python/Ecc/CParser.py
index 94711a9a378a..e817af86f702 100644
--- a/BaseTools/Source/Python/Ecc/CParser.py
+++ b/BaseTools/Source/Python/Ecc/CParser.py
@@ -2,7 +2,7 @@
from antlr3 import *
from antlr3.compat import set, frozenset
-
+
## @file
# The file defines the parser for C source files.
#
@@ -56,23 +56,23 @@ OctalEscape=17
# token names
tokenNames = [
- "<invalid>", "<EOR>", "<DOWN>", "<UP>",
- "IDENTIFIER", "HEX_LITERAL", "OCTAL_LITERAL", "DECIMAL_LITERAL", "CHARACTER_LITERAL",
- "STRING_LITERAL", "FLOATING_POINT_LITERAL", "LETTER", "EscapeSequence",
- "HexDigit", "IntegerTypeSuffix", "Exponent", "FloatTypeSuffix", "OctalEscape",
- "UnicodeEscape", "WS", "BS", "UnicodeVocabulary", "COMMENT", "LINE_COMMENT",
- "LINE_COMMAND", "';'", "'typedef'", "','", "'='", "'extern'", "'static'",
- "'auto'", "'register'", "'STATIC'", "'void'", "'char'", "'short'", "'int'",
- "'long'", "'float'", "'double'", "'signed'", "'unsigned'", "'{'", "'}'",
- "'struct'", "'union'", "':'", "'enum'", "'const'", "'volatile'", "'IN'",
- "'OUT'", "'OPTIONAL'", "'CONST'", "'UNALIGNED'", "'VOLATILE'", "'GLOBAL_REMOVE_IF_UNREFERENCED'",
- "'EFIAPI'", "'EFI_BOOTSERVICE'", "'EFI_RUNTIMESERVICE'", "'PACKED'",
- "'('", "')'", "'['", "']'", "'*'", "'...'", "'+'", "'-'", "'/'", "'%'",
- "'++'", "'--'", "'sizeof'", "'.'", "'->'", "'&'", "'~'", "'!'", "'*='",
- "'/='", "'%='", "'+='", "'-='", "'<<='", "'>>='", "'&='", "'^='", "'|='",
- "'?'", "'||'", "'&&'", "'|'", "'^'", "'=='", "'!='", "'<'", "'>'", "'<='",
- "'>='", "'<<'", "'>>'", "'__asm__'", "'_asm'", "'__asm'", "'case'",
- "'default'", "'if'", "'else'", "'switch'", "'while'", "'do'", "'for'",
+ "<invalid>", "<EOR>", "<DOWN>", "<UP>",
+ "IDENTIFIER", "HEX_LITERAL", "OCTAL_LITERAL", "DECIMAL_LITERAL", "CHARACTER_LITERAL",
+ "STRING_LITERAL", "FLOATING_POINT_LITERAL", "LETTER", "EscapeSequence",
+ "HexDigit", "IntegerTypeSuffix", "Exponent", "FloatTypeSuffix", "OctalEscape",
+ "UnicodeEscape", "WS", "BS", "UnicodeVocabulary", "COMMENT", "LINE_COMMENT",
+ "LINE_COMMAND", "';'", "'typedef'", "','", "'='", "'extern'", "'static'",
+ "'auto'", "'register'", "'STATIC'", "'void'", "'char'", "'short'", "'int'",
+ "'long'", "'float'", "'double'", "'signed'", "'unsigned'", "'{'", "'}'",
+ "'struct'", "'union'", "':'", "'enum'", "'const'", "'volatile'", "'IN'",
+ "'OUT'", "'OPTIONAL'", "'CONST'", "'UNALIGNED'", "'VOLATILE'", "'GLOBAL_REMOVE_IF_UNREFERENCED'",
+ "'EFIAPI'", "'EFI_BOOTSERVICE'", "'EFI_RUNTIMESERVICE'", "'PACKED'",
+ "'('", "')'", "'['", "']'", "'*'", "'...'", "'+'", "'-'", "'/'", "'%'",
+ "'++'", "'--'", "'sizeof'", "'.'", "'->'", "'&'", "'~'", "'!'", "'*='",
+ "'/='", "'%='", "'+='", "'-='", "'<<='", "'>>='", "'&='", "'^='", "'|='",
+ "'?'", "'||'", "'&&'", "'|'", "'^'", "'=='", "'!='", "'<'", "'>'", "'<='",
+ "'>='", "'<<'", "'>>'", "'__asm__'", "'_asm'", "'__asm'", "'case'",
+ "'default'", "'if'", "'else'", "'switch'", "'while'", "'do'", "'for'",
"'goto'", "'continue'", "'break'", "'return'"
]
@@ -103,7 +103,7 @@ class CParser(Parser):
def printTokenInfo(self, line, offset, tokenText):
print str(line)+ ',' + str(offset) + ':' + str(tokenText)
-
+
def StorePredicateExpression(self, StartLine, StartOffset, EndLine, EndOffset, Text):
PredExp = CodeFragment.PredicateExpression(Text, (StartLine, StartOffset), (EndLine, EndOffset))
FileProfile.PredicateExpressionList.append(PredExp)
@@ -119,7 +119,7 @@ class CParser(Parser):
def StoreTypedefDefinition(self, StartLine, StartOffset, EndLine, EndOffset, FromText, ToText):
Tdef = CodeFragment.TypedefDefinition(FromText, ToText, (StartLine, StartOffset), (EndLine, EndOffset))
FileProfile.TypedefDefinitionList.append(Tdef)
-
+
def StoreFunctionDefinition(self, StartLine, StartOffset, EndLine, EndOffset, ModifierText, DeclText, LeftBraceLine, LeftBraceOffset, DeclLine, DeclOffset):
FuncDef = CodeFragment.FunctionDefinition(ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset), (LeftBraceLine, LeftBraceOffset), (DeclLine, DeclOffset))
FileProfile.FunctionDefinitionList.append(FuncDef)
@@ -127,11 +127,11 @@ class CParser(Parser):
def StoreVariableDeclaration(self, StartLine, StartOffset, EndLine, EndOffset, ModifierText, DeclText):
VarDecl = CodeFragment.VariableDeclaration(ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset))
FileProfile.VariableDeclarationList.append(VarDecl)
-
+
def StoreFunctionCalling(self, StartLine, StartOffset, EndLine, EndOffset, FuncName, ParamList):
FuncCall = CodeFragment.FunctionCalling(FuncName, ParamList, (StartLine, StartOffset), (EndLine, EndOffset))
FileProfile.FunctionCallingList.append(FuncCall)
-
+
@@ -143,7 +143,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 1):
- return
+ return
# C.g:103:2: ( ( external_declaration )* )
# C.g:103:4: ( external_declaration )*
@@ -162,7 +162,7 @@ class CParser(Parser):
self.external_declaration()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -182,7 +182,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end translation_unit
@@ -195,7 +195,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 2):
- return
+ return
# C.g:119:2: ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? )
alt3 = 3
@@ -211,7 +211,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 1, self.input)
@@ -227,7 +227,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 2, self.input)
@@ -243,7 +243,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 3, self.input)
@@ -259,7 +259,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 4, self.input)
@@ -275,7 +275,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 5, self.input)
@@ -291,7 +291,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 6, self.input)
@@ -307,7 +307,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 7, self.input)
@@ -323,7 +323,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 8, self.input)
@@ -339,7 +339,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 9, self.input)
@@ -355,7 +355,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 10, self.input)
@@ -371,7 +371,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 11, self.input)
@@ -387,7 +387,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 12, self.input)
@@ -405,7 +405,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 13, self.input)
@@ -421,7 +421,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 14, self.input)
@@ -439,7 +439,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 16, self.input)
@@ -455,7 +455,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 17, self.input)
@@ -471,7 +471,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 18, self.input)
@@ -484,7 +484,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 0, self.input)
@@ -496,7 +496,7 @@ class CParser(Parser):
self.function_definition()
self.following.pop()
if self.failed:
- return
+ return
elif alt3 == 2:
@@ -505,7 +505,7 @@ class CParser(Parser):
self.declaration()
self.following.pop()
if self.failed:
- return
+ return
elif alt3 == 3:
@@ -514,7 +514,7 @@ class CParser(Parser):
self.macro_statement()
self.following.pop()
if self.failed:
- return
+ return
# C.g:121:20: ( ';' )?
alt2 = 2
LA2_0 = self.input.LA(1)
@@ -525,7 +525,7 @@ class CParser(Parser):
# C.g:121:21: ';'
self.match(self.input, 25, self.FOLLOW_25_in_external_declaration126)
if self.failed:
- return
+ return
@@ -541,7 +541,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end external_declaration
@@ -568,7 +568,7 @@ class CParser(Parser):
declarator1 = None
-
+
self.function_definition_stack[-1].ModifierText = ''
self.function_definition_stack[-1].DeclText = ''
self.function_definition_stack[-1].LBLine = 0
@@ -782,7 +782,7 @@ class CParser(Parser):
if self.backtracking == 0:
-
+
if d is not None:
self.function_definition_stack[-1].ModifierText = self.input.toString(d.start,d.stop)
else:
@@ -796,7 +796,7 @@ class CParser(Parser):
else:
self.function_definition_stack[-1].LBLine = b.start.line
self.function_definition_stack[-1].LBOffset = b.start.charPositionInLine
-
+
@@ -804,7 +804,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
if self.backtracking == 0:
-
+
self.StoreFunctionDefinition(retval.start.line, retval.start.charPositionInLine, retval.stop.line, retval.stop.charPositionInLine, self.function_definition_stack[-1].ModifierText, self.function_definition_stack[-1].DeclText, self.function_definition_stack[-1].LBLine, self.function_definition_stack[-1].LBOffset, self.function_definition_stack[-1].DeclLine, self.function_definition_stack[-1].DeclOffset)
@@ -844,7 +844,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 4):
- return
+ return
# C.g:167:2: (a= 'typedef' (b= declaration_specifiers )? c= init_declarator_list d= ';' | s= declaration_specifiers (t= init_declarator_list )? e= ';' )
alt9 = 2
@@ -857,7 +857,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("166:1: declaration : (a= 'typedef' (b= declaration_specifiers )? c= init_declarator_list d= ';' | s= declaration_specifiers (t= init_declarator_list )? e= ';' );", 9, 0, self.input)
@@ -868,7 +868,7 @@ class CParser(Parser):
a = self.input.LT(1)
self.match(self.input, 26, self.FOLLOW_26_in_declaration203)
if self.failed:
- return
+ return
# C.g:167:17: (b= declaration_specifiers )?
alt7 = 2
LA7 = self.input.LA(1)
@@ -905,7 +905,7 @@ class CParser(Parser):
b = self.declaration_specifiers()
self.following.pop()
if self.failed:
- return
+ return
@@ -913,18 +913,18 @@ class CParser(Parser):
c = self.init_declarator_list()
self.following.pop()
if self.failed:
- return
+ return
d = self.input.LT(1)
self.match(self.input, 25, self.FOLLOW_25_in_declaration220)
if self.failed:
- return
+ return
if self.backtracking == 0:
-
+
if b is not None:
self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, self.input.toString(b.start,b.stop), self.input.toString(c.start,c.stop))
else:
self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start,c.stop))
-
+
@@ -934,7 +934,7 @@ class CParser(Parser):
s = self.declaration_specifiers()
self.following.pop()
if self.failed:
- return
+ return
# C.g:175:30: (t= init_declarator_list )?
alt8 = 2
LA8_0 = self.input.LA(1)
@@ -947,16 +947,16 @@ class CParser(Parser):
t = self.init_declarator_list()
self.following.pop()
if self.failed:
- return
+ return
e = self.input.LT(1)
self.match(self.input, 25, self.FOLLOW_25_in_declaration243)
if self.failed:
- return
+ return
if self.backtracking == 0:
-
+
if t is not None:
self.StoreVariableDeclaration(s.start.line, s.start.charPositionInLine, t.start.line, t.start.charPositionInLine, self.input.toString(s.start,s.stop), self.input.toString(t.start,t.stop))
@@ -973,7 +973,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end declaration
@@ -1184,7 +1184,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 7):
- return
+ return
# C.g:194:2: ( declarator ( '=' initializer )? )
# C.g:194:4: declarator ( '=' initializer )?
@@ -1192,7 +1192,7 @@ class CParser(Parser):
self.declarator()
self.following.pop()
if self.failed:
- return
+ return
# C.g:194:15: ( '=' initializer )?
alt12 = 2
LA12_0 = self.input.LA(1)
@@ -1203,12 +1203,12 @@ class CParser(Parser):
# C.g:194:16: '=' initializer
self.match(self.input, 28, self.FOLLOW_28_in_init_declarator329)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_initializer_in_init_declarator331)
self.initializer()
self.following.pop()
if self.failed:
- return
+ return
@@ -1225,7 +1225,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end init_declarator
@@ -1238,7 +1238,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 8):
- return
+ return
# C.g:198:2: ( 'extern' | 'static' | 'auto' | 'register' | 'STATIC' )
# C.g:
@@ -1250,7 +1250,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -1272,7 +1272,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end storage_class_specifier
@@ -1290,7 +1290,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 9):
- return
+ return
# C.g:206:2: ( 'void' | 'char' | 'short' | 'int' | 'long' | 'float' | 'double' | 'signed' | 'unsigned' | s= struct_or_union_specifier | e= enum_specifier | ( IDENTIFIER ( type_qualifier )* declarator )=> type_id )
alt13 = 12
@@ -1323,7 +1323,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("205:1: type_specifier : ( 'void' | 'char' | 'short' | 'int' | 'long' | 'float' | 'double' | 'signed' | 'unsigned' | s= struct_or_union_specifier | e= enum_specifier | ( IDENTIFIER ( type_qualifier )* declarator )=> type_id );", 13, 0, self.input)
@@ -1333,63 +1333,63 @@ class CParser(Parser):
# C.g:206:4: 'void'
self.match(self.input, 34, self.FOLLOW_34_in_type_specifier376)
if self.failed:
- return
+ return
elif alt13 == 2:
# C.g:207:4: 'char'
self.match(self.input, 35, self.FOLLOW_35_in_type_specifier381)
if self.failed:
- return
+ return
elif alt13 == 3:
# C.g:208:4: 'short'
self.match(self.input, 36, self.FOLLOW_36_in_type_specifier386)
if self.failed:
- return
+ return
elif alt13 == 4:
# C.g:209:4: 'int'
self.match(self.input, 37, self.FOLLOW_37_in_type_specifier391)
if self.failed:
- return
+ return
elif alt13 == 5:
# C.g:210:4: 'long'
self.match(self.input, 38, self.FOLLOW_38_in_type_specifier396)
if self.failed:
- return
+ return
elif alt13 == 6:
# C.g:211:4: 'float'
self.match(self.input, 39, self.FOLLOW_39_in_type_specifier401)
if self.failed:
- return
+ return
elif alt13 == 7:
# C.g:212:4: 'double'
self.match(self.input, 40, self.FOLLOW_40_in_type_specifier406)
if self.failed:
- return
+ return
elif alt13 == 8:
# C.g:213:4: 'signed'
self.match(self.input, 41, self.FOLLOW_41_in_type_specifier411)
if self.failed:
- return
+ return
elif alt13 == 9:
# C.g:214:4: 'unsigned'
self.match(self.input, 42, self.FOLLOW_42_in_type_specifier416)
if self.failed:
- return
+ return
elif alt13 == 10:
@@ -1398,9 +1398,9 @@ class CParser(Parser):
s = self.struct_or_union_specifier()
self.following.pop()
if self.failed:
- return
+ return
if self.backtracking == 0:
-
+
if s.stop is not None:
self.StoreStructUnionDefinition(s.start.line, s.start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.input.toString(s.start,s.stop))
@@ -1413,9 +1413,9 @@ class CParser(Parser):
e = self.enum_specifier()
self.following.pop()
if self.failed:
- return
+ return
if self.backtracking == 0:
-
+
if e.stop is not None:
self.StoreEnumerationDefinition(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
@@ -1428,7 +1428,7 @@ class CParser(Parser):
self.type_id()
self.following.pop()
if self.failed:
- return
+ return
@@ -1441,7 +1441,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end type_specifier
@@ -1454,13 +1454,13 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 10):
- return
+ return
# C.g:229:5: ( IDENTIFIER )
# C.g:229:9: IDENTIFIER
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_type_id467)
if self.failed:
- return
+ return
@@ -1474,7 +1474,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end type_id
@@ -1611,7 +1611,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 12):
- return
+ return
# C.g:240:2: ( 'struct' | 'union' )
# C.g:
@@ -1623,7 +1623,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -1645,7 +1645,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end struct_or_union
@@ -1658,7 +1658,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 13):
- return
+ return
# C.g:245:2: ( ( struct_declaration )+ )
# C.g:245:4: ( struct_declaration )+
@@ -1678,7 +1678,7 @@ class CParser(Parser):
self.struct_declaration()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -1687,7 +1687,7 @@ class CParser(Parser):
if self.backtracking > 0:
self.failed = True
- return
+ return
eee = EarlyExitException(16, self.input)
raise eee
@@ -1708,7 +1708,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end struct_declaration_list
@@ -1721,7 +1721,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 14):
- return
+ return
# C.g:249:2: ( specifier_qualifier_list struct_declarator_list ';' )
# C.g:249:4: specifier_qualifier_list struct_declarator_list ';'
@@ -1729,15 +1729,15 @@ class CParser(Parser):
self.specifier_qualifier_list()
self.following.pop()
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_struct_declarator_list_in_struct_declaration551)
self.struct_declarator_list()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 25, self.FOLLOW_25_in_struct_declaration553)
if self.failed:
- return
+ return
@@ -1751,7 +1751,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end struct_declaration
@@ -1764,7 +1764,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 15):
- return
+ return
# C.g:253:2: ( ( type_qualifier | type_specifier )+ )
# C.g:253:4: ( type_qualifier | type_specifier )+
@@ -1831,7 +1831,7 @@ class CParser(Parser):
self.type_qualifier()
self.following.pop()
if self.failed:
- return
+ return
elif alt17 == 2:
@@ -1840,7 +1840,7 @@ class CParser(Parser):
self.type_specifier()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -1849,7 +1849,7 @@ class CParser(Parser):
if self.backtracking > 0:
self.failed = True
- return
+ return
eee = EarlyExitException(17, self.input)
raise eee
@@ -1870,7 +1870,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end specifier_qualifier_list
@@ -1883,7 +1883,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 16):
- return
+ return
# C.g:257:2: ( struct_declarator ( ',' struct_declarator )* )
# C.g:257:4: struct_declarator ( ',' struct_declarator )*
@@ -1891,7 +1891,7 @@ class CParser(Parser):
self.struct_declarator()
self.following.pop()
if self.failed:
- return
+ return
# C.g:257:22: ( ',' struct_declarator )*
while True: #loop18
alt18 = 2
@@ -1905,12 +1905,12 @@ class CParser(Parser):
# C.g:257:23: ',' struct_declarator
self.match(self.input, 27, self.FOLLOW_27_in_struct_declarator_list587)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_struct_declarator_in_struct_declarator_list589)
self.struct_declarator()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -1930,7 +1930,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end struct_declarator_list
@@ -1943,7 +1943,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 17):
- return
+ return
# C.g:261:2: ( declarator ( ':' constant_expression )? | ':' constant_expression )
alt20 = 2
@@ -1956,7 +1956,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("260:1: struct_declarator : ( declarator ( ':' constant_expression )? | ':' constant_expression );", 20, 0, self.input)
@@ -1968,7 +1968,7 @@ class CParser(Parser):
self.declarator()
self.following.pop()
if self.failed:
- return
+ return
# C.g:261:15: ( ':' constant_expression )?
alt19 = 2
LA19_0 = self.input.LA(1)
@@ -1979,12 +1979,12 @@ class CParser(Parser):
# C.g:261:16: ':' constant_expression
self.match(self.input, 47, self.FOLLOW_47_in_struct_declarator605)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_constant_expression_in_struct_declarator607)
self.constant_expression()
self.following.pop()
if self.failed:
- return
+ return
@@ -1994,12 +1994,12 @@ class CParser(Parser):
# C.g:262:4: ':' constant_expression
self.match(self.input, 47, self.FOLLOW_47_in_struct_declarator614)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_constant_expression_in_struct_declarator616)
self.constant_expression()
self.following.pop()
if self.failed:
- return
+ return
@@ -2012,7 +2012,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end struct_declarator
@@ -2180,7 +2180,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 19):
- return
+ return
# C.g:273:2: ( enumerator ( ',' enumerator )* )
# C.g:273:4: enumerator ( ',' enumerator )*
@@ -2188,7 +2188,7 @@ class CParser(Parser):
self.enumerator()
self.following.pop()
if self.failed:
- return
+ return
# C.g:273:15: ( ',' enumerator )*
while True: #loop24
alt24 = 2
@@ -2207,12 +2207,12 @@ class CParser(Parser):
# C.g:273:16: ',' enumerator
self.match(self.input, 27, self.FOLLOW_27_in_enumerator_list680)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_enumerator_in_enumerator_list682)
self.enumerator()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -2232,7 +2232,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end enumerator_list
@@ -2245,13 +2245,13 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 20):
- return
+ return
# C.g:277:2: ( IDENTIFIER ( '=' constant_expression )? )
# C.g:277:4: IDENTIFIER ( '=' constant_expression )?
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_enumerator695)
if self.failed:
- return
+ return
# C.g:277:15: ( '=' constant_expression )?
alt25 = 2
LA25_0 = self.input.LA(1)
@@ -2262,12 +2262,12 @@ class CParser(Parser):
# C.g:277:16: '=' constant_expression
self.match(self.input, 28, self.FOLLOW_28_in_enumerator698)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_constant_expression_in_enumerator700)
self.constant_expression()
self.following.pop()
if self.failed:
- return
+ return
@@ -2284,7 +2284,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end enumerator
@@ -2297,7 +2297,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 21):
- return
+ return
# C.g:281:2: ( 'const' | 'volatile' | 'IN' | 'OUT' | 'OPTIONAL' | 'CONST' | 'UNALIGNED' | 'VOLATILE' | 'GLOBAL_REMOVE_IF_UNREFERENCED' | 'EFIAPI' | 'EFI_BOOTSERVICE' | 'EFI_RUNTIMESERVICE' | 'PACKED' )
# C.g:
@@ -2309,7 +2309,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -2331,7 +2331,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end type_qualifier
@@ -2486,7 +2486,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 23):
- return
+ return
# C.g:303:2: ( IDENTIFIER ( declarator_suffix )* | '(' ( 'EFIAPI' )? declarator ')' ( declarator_suffix )+ )
alt34 = 2
@@ -2499,7 +2499,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("302:1: direct_declarator : ( IDENTIFIER ( declarator_suffix )* | '(' ( 'EFIAPI' )? declarator ')' ( declarator_suffix )+ );", 34, 0, self.input)
@@ -2509,7 +2509,7 @@ class CParser(Parser):
# C.g:303:4: IDENTIFIER ( declarator_suffix )*
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_direct_declarator819)
if self.failed:
- return
+ return
# C.g:303:15: ( declarator_suffix )*
while True: #loop31
alt31 = 2
@@ -2753,7 +2753,7 @@ class CParser(Parser):
self.declarator_suffix()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -2766,7 +2766,7 @@ class CParser(Parser):
# C.g:304:4: '(' ( 'EFIAPI' )? declarator ')' ( declarator_suffix )+
self.match(self.input, 62, self.FOLLOW_62_in_direct_declarator827)
if self.failed:
- return
+ return
# C.g:304:8: ( 'EFIAPI' )?
alt32 = 2
LA32_0 = self.input.LA(1)
@@ -2780,7 +2780,7 @@ class CParser(Parser):
# C.g:304:9: 'EFIAPI'
self.match(self.input, 58, self.FOLLOW_58_in_direct_declarator830)
if self.failed:
- return
+ return
@@ -2788,10 +2788,10 @@ class CParser(Parser):
self.declarator()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_direct_declarator836)
if self.failed:
- return
+ return
# C.g:304:35: ( declarator_suffix )+
cnt33 = 0
while True: #loop33
@@ -3036,7 +3036,7 @@ class CParser(Parser):
self.declarator_suffix()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -3045,7 +3045,7 @@ class CParser(Parser):
if self.backtracking > 0:
self.failed = True
- return
+ return
eee = EarlyExitException(33, self.input)
raise eee
@@ -3065,7 +3065,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end direct_declarator
@@ -3078,7 +3078,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 24):
- return
+ return
# C.g:308:2: ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' )
alt35 = 5
@@ -3094,7 +3094,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 1, self.input)
@@ -3116,7 +3116,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 29, self.input)
@@ -3125,7 +3125,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 2, self.input)
@@ -3134,7 +3134,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 0, self.input)
@@ -3144,65 +3144,65 @@ class CParser(Parser):
# C.g:308:6: '[' constant_expression ']'
self.match(self.input, 64, self.FOLLOW_64_in_declarator_suffix852)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_constant_expression_in_declarator_suffix854)
self.constant_expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 65, self.FOLLOW_65_in_declarator_suffix856)
if self.failed:
- return
+ return
elif alt35 == 2:
# C.g:309:9: '[' ']'
self.match(self.input, 64, self.FOLLOW_64_in_declarator_suffix866)
if self.failed:
- return
+ return
self.match(self.input, 65, self.FOLLOW_65_in_declarator_suffix868)
if self.failed:
- return
+ return
elif alt35 == 3:
# C.g:310:9: '(' parameter_type_list ')'
self.match(self.input, 62, self.FOLLOW_62_in_declarator_suffix878)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_parameter_type_list_in_declarator_suffix880)
self.parameter_type_list()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_declarator_suffix882)
if self.failed:
- return
+ return
elif alt35 == 4:
# C.g:311:9: '(' identifier_list ')'
self.match(self.input, 62, self.FOLLOW_62_in_declarator_suffix892)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_identifier_list_in_declarator_suffix894)
self.identifier_list()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_declarator_suffix896)
if self.failed:
- return
+ return
elif alt35 == 5:
# C.g:312:9: '(' ')'
self.match(self.input, 62, self.FOLLOW_62_in_declarator_suffix906)
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_declarator_suffix908)
if self.failed:
- return
+ return
@@ -3215,7 +3215,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end declarator_suffix
@@ -3228,7 +3228,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 25):
- return
+ return
# C.g:316:2: ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' )
alt38 = 3
@@ -3246,7 +3246,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 2, self.input)
@@ -3262,7 +3262,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 3, self.input)
@@ -3278,7 +3278,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 4, self.input)
@@ -3294,7 +3294,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 5, self.input)
@@ -3312,7 +3312,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 21, self.input)
@@ -3328,7 +3328,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 29, self.input)
@@ -3337,7 +3337,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 1, self.input)
@@ -3346,7 +3346,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 0, self.input)
@@ -3356,7 +3356,7 @@ class CParser(Parser):
# C.g:316:4: '*' ( type_qualifier )+ ( pointer )?
self.match(self.input, 66, self.FOLLOW_66_in_pointer919)
if self.failed:
- return
+ return
# C.g:316:8: ( type_qualifier )+
cnt36 = 0
while True: #loop36
@@ -3404,7 +3404,7 @@ class CParser(Parser):
self.type_qualifier()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -3413,7 +3413,7 @@ class CParser(Parser):
if self.backtracking > 0:
self.failed = True
- return
+ return
eee = EarlyExitException(36, self.input)
raise eee
@@ -3436,7 +3436,7 @@ class CParser(Parser):
self.pointer()
self.following.pop()
if self.failed:
- return
+ return
@@ -3446,19 +3446,19 @@ class CParser(Parser):
# C.g:317:4: '*' pointer
self.match(self.input, 66, self.FOLLOW_66_in_pointer930)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_pointer_in_pointer932)
self.pointer()
self.following.pop()
if self.failed:
- return
+ return
elif alt38 == 3:
# C.g:318:4: '*'
self.match(self.input, 66, self.FOLLOW_66_in_pointer937)
if self.failed:
- return
+ return
@@ -3471,7 +3471,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end pointer
@@ -3484,7 +3484,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 26):
- return
+ return
# C.g:322:2: ( parameter_list ( ',' ( 'OPTIONAL' )? '...' )? )
# C.g:322:4: parameter_list ( ',' ( 'OPTIONAL' )? '...' )?
@@ -3492,7 +3492,7 @@ class CParser(Parser):
self.parameter_list()
self.following.pop()
if self.failed:
- return
+ return
# C.g:322:19: ( ',' ( 'OPTIONAL' )? '...' )?
alt40 = 2
LA40_0 = self.input.LA(1)
@@ -3503,7 +3503,7 @@ class CParser(Parser):
# C.g:322:20: ',' ( 'OPTIONAL' )? '...'
self.match(self.input, 27, self.FOLLOW_27_in_parameter_type_list951)
if self.failed:
- return
+ return
# C.g:322:24: ( 'OPTIONAL' )?
alt39 = 2
LA39_0 = self.input.LA(1)
@@ -3514,13 +3514,13 @@ class CParser(Parser):
# C.g:322:25: 'OPTIONAL'
self.match(self.input, 53, self.FOLLOW_53_in_parameter_type_list954)
if self.failed:
- return
+ return
self.match(self.input, 67, self.FOLLOW_67_in_parameter_type_list958)
if self.failed:
- return
+ return
@@ -3537,7 +3537,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end parameter_type_list
@@ -3550,7 +3550,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 27):
- return
+ return
# C.g:326:2: ( parameter_declaration ( ',' ( 'OPTIONAL' )? parameter_declaration )* )
# C.g:326:4: parameter_declaration ( ',' ( 'OPTIONAL' )? parameter_declaration )*
@@ -3558,7 +3558,7 @@ class CParser(Parser):
self.parameter_declaration()
self.following.pop()
if self.failed:
- return
+ return
# C.g:326:26: ( ',' ( 'OPTIONAL' )? parameter_declaration )*
while True: #loop42
alt42 = 2
@@ -3584,7 +3584,7 @@ class CParser(Parser):
# C.g:326:27: ',' ( 'OPTIONAL' )? parameter_declaration
self.match(self.input, 27, self.FOLLOW_27_in_parameter_list974)
if self.failed:
- return
+ return
# C.g:326:31: ( 'OPTIONAL' )?
alt41 = 2
LA41_0 = self.input.LA(1)
@@ -3598,7 +3598,7 @@ class CParser(Parser):
# C.g:326:32: 'OPTIONAL'
self.match(self.input, 53, self.FOLLOW_53_in_parameter_list977)
if self.failed:
- return
+ return
@@ -3606,7 +3606,7 @@ class CParser(Parser):
self.parameter_declaration()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -3626,7 +3626,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end parameter_list
@@ -3639,7 +3639,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 28):
- return
+ return
# C.g:330:2: ( declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )? | ( pointer )* IDENTIFIER )
alt46 = 2
@@ -3656,7 +3656,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("329:1: parameter_declaration : ( declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )? | ( pointer )* IDENTIFIER );", 46, 13, self.input)
@@ -3667,7 +3667,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("329:1: parameter_declaration : ( declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )? | ( pointer )* IDENTIFIER );", 46, 0, self.input)
@@ -3679,7 +3679,7 @@ class CParser(Parser):
self.declaration_specifiers()
self.following.pop()
if self.failed:
- return
+ return
# C.g:330:27: ( declarator | abstract_declarator )*
while True: #loop43
alt43 = 3
@@ -3763,7 +3763,7 @@ class CParser(Parser):
self.declarator()
self.following.pop()
if self.failed:
- return
+ return
elif alt43 == 2:
@@ -3772,7 +3772,7 @@ class CParser(Parser):
self.abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -3789,7 +3789,7 @@ class CParser(Parser):
# C.g:330:62: 'OPTIONAL'
self.match(self.input, 53, self.FOLLOW_53_in_parameter_declaration1004)
if self.failed:
- return
+ return
@@ -3812,7 +3812,7 @@ class CParser(Parser):
self.pointer()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -3821,7 +3821,7 @@ class CParser(Parser):
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_parameter_declaration1016)
if self.failed:
- return
+ return
@@ -3834,7 +3834,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end parameter_declaration
@@ -3847,13 +3847,13 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 29):
- return
+ return
# C.g:336:2: ( IDENTIFIER ( ',' IDENTIFIER )* )
# C.g:336:4: IDENTIFIER ( ',' IDENTIFIER )*
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_identifier_list1027)
if self.failed:
- return
+ return
# C.g:337:2: ( ',' IDENTIFIER )*
while True: #loop47
alt47 = 2
@@ -3867,10 +3867,10 @@ class CParser(Parser):
# C.g:337:3: ',' IDENTIFIER
self.match(self.input, 27, self.FOLLOW_27_in_identifier_list1031)
if self.failed:
- return
+ return
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_identifier_list1033)
if self.failed:
- return
+ return
else:
@@ -3890,7 +3890,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end identifier_list
@@ -3903,7 +3903,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 30):
- return
+ return
# C.g:341:2: ( specifier_qualifier_list ( abstract_declarator )? | type_id )
alt49 = 2
@@ -3921,7 +3921,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("340:1: type_name : ( specifier_qualifier_list ( abstract_declarator )? | type_id );", 49, 13, self.input)
@@ -3930,7 +3930,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("340:1: type_name : ( specifier_qualifier_list ( abstract_declarator )? | type_id );", 49, 0, self.input)
@@ -3942,7 +3942,7 @@ class CParser(Parser):
self.specifier_qualifier_list()
self.following.pop()
if self.failed:
- return
+ return
# C.g:341:29: ( abstract_declarator )?
alt48 = 2
LA48_0 = self.input.LA(1)
@@ -3955,7 +3955,7 @@ class CParser(Parser):
self.abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
@@ -3967,7 +3967,7 @@ class CParser(Parser):
self.type_id()
self.following.pop()
if self.failed:
- return
+ return
@@ -3980,7 +3980,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end type_name
@@ -3993,7 +3993,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 31):
- return
+ return
# C.g:346:2: ( pointer ( direct_abstract_declarator )? | direct_abstract_declarator )
alt51 = 2
@@ -4006,7 +4006,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("345:1: abstract_declarator : ( pointer ( direct_abstract_declarator )? | direct_abstract_declarator );", 51, 0, self.input)
@@ -4018,7 +4018,7 @@ class CParser(Parser):
self.pointer()
self.following.pop()
if self.failed:
- return
+ return
# C.g:346:12: ( direct_abstract_declarator )?
alt50 = 2
LA50_0 = self.input.LA(1)
@@ -4203,7 +4203,7 @@ class CParser(Parser):
self.direct_abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
@@ -4215,7 +4215,7 @@ class CParser(Parser):
self.direct_abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
@@ -4228,7 +4228,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end abstract_declarator
@@ -4241,7 +4241,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 32):
- return
+ return
# C.g:351:2: ( ( '(' abstract_declarator ')' | abstract_declarator_suffix ) ( abstract_declarator_suffix )* )
# C.g:351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix ) ( abstract_declarator_suffix )*
@@ -4263,7 +4263,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix )", 52, 18, self.input)
@@ -4274,7 +4274,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix )", 52, 1, self.input)
@@ -4285,7 +4285,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix )", 52, 0, self.input)
@@ -4295,15 +4295,15 @@ class CParser(Parser):
# C.g:351:6: '(' abstract_declarator ')'
self.match(self.input, 62, self.FOLLOW_62_in_direct_abstract_declarator1086)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_abstract_declarator_in_direct_abstract_declarator1088)
self.abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_direct_abstract_declarator1090)
if self.failed:
- return
+ return
elif alt52 == 2:
@@ -4312,7 +4312,7 @@ class CParser(Parser):
self.abstract_declarator_suffix()
self.following.pop()
if self.failed:
- return
+ return
@@ -4559,7 +4559,7 @@ class CParser(Parser):
self.abstract_declarator_suffix()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -4579,7 +4579,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end direct_abstract_declarator
@@ -4592,7 +4592,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 33):
- return
+ return
# C.g:355:2: ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' )
alt54 = 4
@@ -4608,7 +4608,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );", 54, 1, self.input)
@@ -4624,7 +4624,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );", 54, 2, self.input)
@@ -4633,7 +4633,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );", 54, 0, self.input)
@@ -4643,50 +4643,50 @@ class CParser(Parser):
# C.g:355:4: '[' ']'
self.match(self.input, 64, self.FOLLOW_64_in_abstract_declarator_suffix1110)
if self.failed:
- return
+ return
self.match(self.input, 65, self.FOLLOW_65_in_abstract_declarator_suffix1112)
if self.failed:
- return
+ return
elif alt54 == 2:
# C.g:356:4: '[' constant_expression ']'
self.match(self.input, 64, self.FOLLOW_64_in_abstract_declarator_suffix1117)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_constant_expression_in_abstract_declarator_suffix1119)
self.constant_expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 65, self.FOLLOW_65_in_abstract_declarator_suffix1121)
if self.failed:
- return
+ return
elif alt54 == 3:
# C.g:357:4: '(' ')'
self.match(self.input, 62, self.FOLLOW_62_in_abstract_declarator_suffix1126)
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_abstract_declarator_suffix1128)
if self.failed:
- return
+ return
elif alt54 == 4:
# C.g:358:4: '(' parameter_type_list ')'
self.match(self.input, 62, self.FOLLOW_62_in_abstract_declarator_suffix1133)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_parameter_type_list_in_abstract_declarator_suffix1135)
self.parameter_type_list()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_abstract_declarator_suffix1137)
if self.failed:
- return
+ return
@@ -4699,7 +4699,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end abstract_declarator_suffix
@@ -4712,7 +4712,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 34):
- return
+ return
# C.g:363:2: ( assignment_expression | '{' initializer_list ( ',' )? '}' )
alt56 = 2
@@ -4725,7 +4725,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("361:1: initializer : ( assignment_expression | '{' initializer_list ( ',' )? '}' );", 56, 0, self.input)
@@ -4737,19 +4737,19 @@ class CParser(Parser):
self.assignment_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt56 == 2:
# C.g:364:4: '{' initializer_list ( ',' )? '}'
self.match(self.input, 43, self.FOLLOW_43_in_initializer1155)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_initializer_list_in_initializer1157)
self.initializer_list()
self.following.pop()
if self.failed:
- return
+ return
# C.g:364:25: ( ',' )?
alt55 = 2
LA55_0 = self.input.LA(1)
@@ -4760,13 +4760,13 @@ class CParser(Parser):
# C.g:0:0: ','
self.match(self.input, 27, self.FOLLOW_27_in_initializer1159)
if self.failed:
- return
+ return
self.match(self.input, 44, self.FOLLOW_44_in_initializer1162)
if self.failed:
- return
+ return
@@ -4779,7 +4779,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end initializer
@@ -4792,7 +4792,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 35):
- return
+ return
# C.g:368:2: ( initializer ( ',' initializer )* )
# C.g:368:4: initializer ( ',' initializer )*
@@ -4800,7 +4800,7 @@ class CParser(Parser):
self.initializer()
self.following.pop()
if self.failed:
- return
+ return
# C.g:368:16: ( ',' initializer )*
while True: #loop57
alt57 = 2
@@ -4819,12 +4819,12 @@ class CParser(Parser):
# C.g:368:17: ',' initializer
self.match(self.input, 27, self.FOLLOW_27_in_initializer_list1176)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_initializer_in_initializer_list1178)
self.initializer()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -4844,7 +4844,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end initializer_list
@@ -4955,7 +4955,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 37):
- return
+ return
# C.g:378:2: ( ( multiplicative_expression ) ( '+' multiplicative_expression | '-' multiplicative_expression )* )
# C.g:378:4: ( multiplicative_expression ) ( '+' multiplicative_expression | '-' multiplicative_expression )*
@@ -4965,7 +4965,7 @@ class CParser(Parser):
self.multiplicative_expression()
self.following.pop()
if self.failed:
- return
+ return
@@ -4984,24 +4984,24 @@ class CParser(Parser):
# C.g:378:33: '+' multiplicative_expression
self.match(self.input, 68, self.FOLLOW_68_in_additive_expression1229)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_multiplicative_expression_in_additive_expression1231)
self.multiplicative_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt61 == 2:
# C.g:378:65: '-' multiplicative_expression
self.match(self.input, 69, self.FOLLOW_69_in_additive_expression1235)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_multiplicative_expression_in_additive_expression1237)
self.multiplicative_expression()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -5021,7 +5021,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end additive_expression
@@ -5034,7 +5034,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 38):
- return
+ return
# C.g:382:2: ( ( cast_expression ) ( '*' cast_expression | '/' cast_expression | '%' cast_expression )* )
# C.g:382:4: ( cast_expression ) ( '*' cast_expression | '/' cast_expression | '%' cast_expression )*
@@ -5044,7 +5044,7 @@ class CParser(Parser):
self.cast_expression()
self.following.pop()
if self.failed:
- return
+ return
@@ -5063,36 +5063,36 @@ class CParser(Parser):
# C.g:382:23: '*' cast_expression
self.match(self.input, 66, self.FOLLOW_66_in_multiplicative_expression1255)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_cast_expression_in_multiplicative_expression1257)
self.cast_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt62 == 2:
# C.g:382:45: '/' cast_expression
self.match(self.input, 70, self.FOLLOW_70_in_multiplicative_expression1261)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_cast_expression_in_multiplicative_expression1263)
self.cast_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt62 == 3:
# C.g:382:67: '%' cast_expression
self.match(self.input, 71, self.FOLLOW_71_in_multiplicative_expression1267)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_cast_expression_in_multiplicative_expression1269)
self.cast_expression()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -5112,7 +5112,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end multiplicative_expression
@@ -5125,7 +5125,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 39):
- return
+ return
# C.g:386:2: ( '(' type_name ')' cast_expression | unary_expression )
alt63 = 2
@@ -5145,7 +5145,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );", 63, 25, self.input)
@@ -5156,7 +5156,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );", 63, 1, self.input)
@@ -5167,7 +5167,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );", 63, 0, self.input)
@@ -5177,20 +5177,20 @@ class CParser(Parser):
# C.g:386:4: '(' type_name ')' cast_expression
self.match(self.input, 62, self.FOLLOW_62_in_cast_expression1282)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_type_name_in_cast_expression1284)
self.type_name()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_cast_expression1286)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_cast_expression_in_cast_expression1288)
self.cast_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt63 == 2:
@@ -5199,7 +5199,7 @@ class CParser(Parser):
self.unary_expression()
self.following.pop()
if self.failed:
- return
+ return
@@ -5212,7 +5212,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end cast_expression
@@ -5225,7 +5225,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 40):
- return
+ return
# C.g:391:2: ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' )
alt64 = 6
@@ -5251,7 +5251,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );", 64, 13, self.input)
@@ -5262,7 +5262,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );", 64, 12, self.input)
@@ -5271,7 +5271,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );", 64, 0, self.input)
@@ -5283,31 +5283,31 @@ class CParser(Parser):
self.postfix_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt64 == 2:
# C.g:392:4: '++' unary_expression
self.match(self.input, 72, self.FOLLOW_72_in_unary_expression1309)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_unary_expression_in_unary_expression1311)
self.unary_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt64 == 3:
# C.g:393:4: '--' unary_expression
self.match(self.input, 73, self.FOLLOW_73_in_unary_expression1316)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_unary_expression_in_unary_expression1318)
self.unary_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt64 == 4:
@@ -5316,42 +5316,42 @@ class CParser(Parser):
self.unary_operator()
self.following.pop()
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_cast_expression_in_unary_expression1325)
self.cast_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt64 == 5:
# C.g:395:4: 'sizeof' unary_expression
self.match(self.input, 74, self.FOLLOW_74_in_unary_expression1330)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_unary_expression_in_unary_expression1332)
self.unary_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt64 == 6:
# C.g:396:4: 'sizeof' '(' type_name ')'
self.match(self.input, 74, self.FOLLOW_74_in_unary_expression1337)
if self.failed:
- return
+ return
self.match(self.input, 62, self.FOLLOW_62_in_unary_expression1339)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_type_name_in_unary_expression1341)
self.type_name()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_unary_expression1343)
if self.failed:
- return
+ return
@@ -5364,7 +5364,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end unary_expression
@@ -5384,13 +5384,13 @@ class CParser(Parser):
c = None
-
+
self.postfix_expression_stack[-1].FuncCallText = ''
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 41):
- return
+ return
# C.g:406:2: (p= primary_expression ( '[' expression ']' | '(' a= ')' | '(' c= argument_expression_list b= ')' | '(' macro_parameter_list ')' | '.' x= IDENTIFIER | '*' y= IDENTIFIER | '->' z= IDENTIFIER | '++' | '--' )* )
# C.g:406:6: p= primary_expression ( '[' expression ']' | '(' a= ')' | '(' c= argument_expression_list b= ')' | '(' macro_parameter_list ')' | '.' x= IDENTIFIER | '*' y= IDENTIFIER | '->' z= IDENTIFIER | '++' | '--' )*
@@ -5398,7 +5398,7 @@ class CParser(Parser):
p = self.primary_expression()
self.following.pop()
if self.failed:
- return
+ return
if self.backtracking == 0:
self.postfix_expression_stack[-1].FuncCallText += self.input.toString(p.start,p.stop)
@@ -5460,26 +5460,26 @@ class CParser(Parser):
# C.g:407:13: '[' expression ']'
self.match(self.input, 64, self.FOLLOW_64_in_postfix_expression1383)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_expression_in_postfix_expression1385)
self.expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 65, self.FOLLOW_65_in_postfix_expression1387)
if self.failed:
- return
+ return
elif alt65 == 2:
# C.g:408:13: '(' a= ')'
self.match(self.input, 62, self.FOLLOW_62_in_postfix_expression1401)
if self.failed:
- return
+ return
a = self.input.LT(1)
self.match(self.input, 63, self.FOLLOW_63_in_postfix_expression1405)
if self.failed:
- return
+ return
if self.backtracking == 0:
self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, a.line, a.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, '')
@@ -5489,16 +5489,16 @@ class CParser(Parser):
# C.g:409:13: '(' c= argument_expression_list b= ')'
self.match(self.input, 62, self.FOLLOW_62_in_postfix_expression1420)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_argument_expression_list_in_postfix_expression1424)
c = self.argument_expression_list()
self.following.pop()
if self.failed:
- return
+ return
b = self.input.LT(1)
self.match(self.input, 63, self.FOLLOW_63_in_postfix_expression1428)
if self.failed:
- return
+ return
if self.backtracking == 0:
self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, self.input.toString(c.start,c.stop))
@@ -5508,26 +5508,26 @@ class CParser(Parser):
# C.g:410:13: '(' macro_parameter_list ')'
self.match(self.input, 62, self.FOLLOW_62_in_postfix_expression1444)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_macro_parameter_list_in_postfix_expression1446)
self.macro_parameter_list()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_postfix_expression1448)
if self.failed:
- return
+ return
elif alt65 == 5:
# C.g:411:13: '.' x= IDENTIFIER
self.match(self.input, 75, self.FOLLOW_75_in_postfix_expression1462)
if self.failed:
- return
+ return
x = self.input.LT(1)
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_postfix_expression1466)
if self.failed:
- return
+ return
if self.backtracking == 0:
self.postfix_expression_stack[-1].FuncCallText += '.' + x.text
@@ -5537,11 +5537,11 @@ class CParser(Parser):
# C.g:412:13: '*' y= IDENTIFIER
self.match(self.input, 66, self.FOLLOW_66_in_postfix_expression1482)
if self.failed:
- return
+ return
y = self.input.LT(1)
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_postfix_expression1486)
if self.failed:
- return
+ return
if self.backtracking == 0:
self.postfix_expression_stack[-1].FuncCallText = y.text
@@ -5551,11 +5551,11 @@ class CParser(Parser):
# C.g:413:13: '->' z= IDENTIFIER
self.match(self.input, 76, self.FOLLOW_76_in_postfix_expression1502)
if self.failed:
- return
+ return
z = self.input.LT(1)
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_postfix_expression1506)
if self.failed:
- return
+ return
if self.backtracking == 0:
self.postfix_expression_stack[-1].FuncCallText += '->' + z.text
@@ -5565,14 +5565,14 @@ class CParser(Parser):
# C.g:414:13: '++'
self.match(self.input, 72, self.FOLLOW_72_in_postfix_expression1522)
if self.failed:
- return
+ return
elif alt65 == 9:
# C.g:415:13: '--'
self.match(self.input, 73, self.FOLLOW_73_in_postfix_expression1536)
if self.failed:
- return
+ return
else:
@@ -5593,7 +5593,7 @@ class CParser(Parser):
self.postfix_expression_stack.pop()
pass
- return
+ return
# $ANTLR end postfix_expression
@@ -5606,7 +5606,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 42):
- return
+ return
# C.g:420:2: ( parameter_declaration ( ',' parameter_declaration )* )
# C.g:420:4: parameter_declaration ( ',' parameter_declaration )*
@@ -5614,7 +5614,7 @@ class CParser(Parser):
self.parameter_declaration()
self.following.pop()
if self.failed:
- return
+ return
# C.g:420:26: ( ',' parameter_declaration )*
while True: #loop66
alt66 = 2
@@ -5628,12 +5628,12 @@ class CParser(Parser):
# C.g:420:27: ',' parameter_declaration
self.match(self.input, 27, self.FOLLOW_27_in_macro_parameter_list1562)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_parameter_declaration_in_macro_parameter_list1564)
self.parameter_declaration()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -5653,7 +5653,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end macro_parameter_list
@@ -5666,7 +5666,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 43):
- return
+ return
# C.g:424:2: ( '&' | '*' | '+' | '-' | '~' | '!' )
# C.g:
@@ -5678,7 +5678,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -5700,7 +5700,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end unary_operator
@@ -5811,7 +5811,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 45):
- return
+ return
# C.g:439:5: ( HEX_LITERAL | OCTAL_LITERAL | DECIMAL_LITERAL | CHARACTER_LITERAL | ( ( IDENTIFIER )* ( STRING_LITERAL )+ )+ ( IDENTIFIER )* | FLOATING_POINT_LITERAL )
alt72 = 6
@@ -5831,7 +5831,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("438:1: constant : ( HEX_LITERAL | OCTAL_LITERAL | DECIMAL_LITERAL | CHARACTER_LITERAL | ( ( IDENTIFIER )* ( STRING_LITERAL )+ )+ ( IDENTIFIER )* | FLOATING_POINT_LITERAL );", 72, 0, self.input)
@@ -5841,28 +5841,28 @@ class CParser(Parser):
# C.g:439:9: HEX_LITERAL
self.match(self.input, HEX_LITERAL, self.FOLLOW_HEX_LITERAL_in_constant1643)
if self.failed:
- return
+ return
elif alt72 == 2:
# C.g:440:9: OCTAL_LITERAL
self.match(self.input, OCTAL_LITERAL, self.FOLLOW_OCTAL_LITERAL_in_constant1653)
if self.failed:
- return
+ return
elif alt72 == 3:
# C.g:441:9: DECIMAL_LITERAL
self.match(self.input, DECIMAL_LITERAL, self.FOLLOW_DECIMAL_LITERAL_in_constant1663)
if self.failed:
- return
+ return
elif alt72 == 4:
# C.g:442:7: CHARACTER_LITERAL
self.match(self.input, CHARACTER_LITERAL, self.FOLLOW_CHARACTER_LITERAL_in_constant1671)
if self.failed:
- return
+ return
elif alt72 == 5:
@@ -5906,7 +5906,7 @@ class CParser(Parser):
# C.g:0:0: IDENTIFIER
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_constant1680)
if self.failed:
- return
+ return
else:
@@ -5932,7 +5932,7 @@ class CParser(Parser):
# C.g:0:0: STRING_LITERAL
self.match(self.input, STRING_LITERAL, self.FOLLOW_STRING_LITERAL_in_constant1683)
if self.failed:
- return
+ return
else:
@@ -5941,7 +5941,7 @@ class CParser(Parser):
if self.backtracking > 0:
self.failed = True
- return
+ return
eee = EarlyExitException(69, self.input)
raise eee
@@ -5957,7 +5957,7 @@ class CParser(Parser):
if self.backtracking > 0:
self.failed = True
- return
+ return
eee = EarlyExitException(70, self.input)
raise eee
@@ -5978,7 +5978,7 @@ class CParser(Parser):
# C.g:0:0: IDENTIFIER
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_constant1688)
if self.failed:
- return
+ return
else:
@@ -5991,7 +5991,7 @@ class CParser(Parser):
# C.g:444:9: FLOATING_POINT_LITERAL
self.match(self.input, FLOATING_POINT_LITERAL, self.FOLLOW_FLOATING_POINT_LITERAL_in_constant1699)
if self.failed:
- return
+ return
@@ -6004,7 +6004,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end constant
@@ -6087,7 +6087,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 47):
- return
+ return
# C.g:454:2: ( conditional_expression )
# C.g:454:4: conditional_expression
@@ -6095,7 +6095,7 @@ class CParser(Parser):
self.conditional_expression()
self.following.pop()
if self.failed:
- return
+ return
@@ -6109,7 +6109,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end constant_expression
@@ -6122,7 +6122,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 48):
- return
+ return
# C.g:458:2: ( lvalue assignment_operator assignment_expression | conditional_expression )
alt74 = 2
@@ -6139,7 +6139,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 13, self.input)
@@ -6155,7 +6155,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 14, self.input)
@@ -6171,7 +6171,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 15, self.input)
@@ -6187,7 +6187,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 16, self.input)
@@ -6203,7 +6203,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 17, self.input)
@@ -6219,7 +6219,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 18, self.input)
@@ -6235,7 +6235,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 19, self.input)
@@ -6253,7 +6253,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 21, self.input)
@@ -6269,7 +6269,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 22, self.input)
@@ -6280,7 +6280,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 1, self.input)
@@ -6298,7 +6298,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 44, self.input)
@@ -6314,7 +6314,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 45, self.input)
@@ -6330,7 +6330,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 46, self.input)
@@ -6346,7 +6346,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 47, self.input)
@@ -6362,7 +6362,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 48, self.input)
@@ -6378,7 +6378,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 49, self.input)
@@ -6394,7 +6394,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 50, self.input)
@@ -6407,7 +6407,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 2, self.input)
@@ -6425,7 +6425,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 73, self.input)
@@ -6441,7 +6441,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 74, self.input)
@@ -6457,7 +6457,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 75, self.input)
@@ -6473,7 +6473,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 76, self.input)
@@ -6489,7 +6489,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 77, self.input)
@@ -6505,7 +6505,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 78, self.input)
@@ -6521,7 +6521,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 79, self.input)
@@ -6534,7 +6534,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 3, self.input)
@@ -6552,7 +6552,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 102, self.input)
@@ -6568,7 +6568,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 103, self.input)
@@ -6584,7 +6584,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 104, self.input)
@@ -6600,7 +6600,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 105, self.input)
@@ -6616,7 +6616,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 106, self.input)
@@ -6632,7 +6632,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 107, self.input)
@@ -6648,7 +6648,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 108, self.input)
@@ -6661,7 +6661,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 4, self.input)
@@ -6679,7 +6679,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 131, self.input)
@@ -6695,7 +6695,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 132, self.input)
@@ -6711,7 +6711,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 133, self.input)
@@ -6727,7 +6727,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 134, self.input)
@@ -6743,7 +6743,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 135, self.input)
@@ -6759,7 +6759,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 136, self.input)
@@ -6775,7 +6775,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 137, self.input)
@@ -6788,7 +6788,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 5, self.input)
@@ -6806,7 +6806,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 160, self.input)
@@ -6822,7 +6822,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 161, self.input)
@@ -6838,7 +6838,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 162, self.input)
@@ -6854,7 +6854,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 163, self.input)
@@ -6870,7 +6870,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 164, self.input)
@@ -6886,7 +6886,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 165, self.input)
@@ -6902,7 +6902,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 166, self.input)
@@ -6918,7 +6918,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 167, self.input)
@@ -6936,7 +6936,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 189, self.input)
@@ -6947,7 +6947,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 6, self.input)
@@ -6965,7 +6965,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 191, self.input)
@@ -6981,7 +6981,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 192, self.input)
@@ -6997,7 +6997,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 193, self.input)
@@ -7013,7 +7013,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 194, self.input)
@@ -7029,7 +7029,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 195, self.input)
@@ -7045,7 +7045,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 196, self.input)
@@ -7061,7 +7061,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 197, self.input)
@@ -7074,7 +7074,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 7, self.input)
@@ -7092,7 +7092,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 220, self.input)
@@ -7108,7 +7108,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 221, self.input)
@@ -7124,7 +7124,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 222, self.input)
@@ -7140,7 +7140,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 223, self.input)
@@ -7156,7 +7156,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 224, self.input)
@@ -7172,7 +7172,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 225, self.input)
@@ -7188,7 +7188,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 226, self.input)
@@ -7204,7 +7204,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 227, self.input)
@@ -7220,7 +7220,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 228, self.input)
@@ -7236,7 +7236,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 229, self.input)
@@ -7252,7 +7252,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 230, self.input)
@@ -7268,7 +7268,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 231, self.input)
@@ -7279,7 +7279,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 8, self.input)
@@ -7297,7 +7297,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 244, self.input)
@@ -7313,7 +7313,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 245, self.input)
@@ -7329,7 +7329,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 246, self.input)
@@ -7345,7 +7345,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 247, self.input)
@@ -7361,7 +7361,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 248, self.input)
@@ -7377,7 +7377,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 249, self.input)
@@ -7393,7 +7393,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 250, self.input)
@@ -7409,7 +7409,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 251, self.input)
@@ -7425,7 +7425,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 252, self.input)
@@ -7441,7 +7441,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 253, self.input)
@@ -7457,7 +7457,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 254, self.input)
@@ -7473,7 +7473,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 255, self.input)
@@ -7482,7 +7482,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 9, self.input)
@@ -7500,7 +7500,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 256, self.input)
@@ -7516,7 +7516,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 257, self.input)
@@ -7532,7 +7532,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 258, self.input)
@@ -7548,7 +7548,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 259, self.input)
@@ -7564,7 +7564,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 260, self.input)
@@ -7580,7 +7580,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 261, self.input)
@@ -7596,7 +7596,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 262, self.input)
@@ -7612,7 +7612,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 263, self.input)
@@ -7628,7 +7628,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 264, self.input)
@@ -7644,7 +7644,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 265, self.input)
@@ -7660,7 +7660,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 266, self.input)
@@ -7676,7 +7676,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 267, self.input)
@@ -7685,7 +7685,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 10, self.input)
@@ -7703,7 +7703,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 268, self.input)
@@ -7719,7 +7719,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 269, self.input)
@@ -7735,7 +7735,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 270, self.input)
@@ -7751,7 +7751,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 271, self.input)
@@ -7767,7 +7767,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 272, self.input)
@@ -7783,7 +7783,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 273, self.input)
@@ -7799,7 +7799,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 274, self.input)
@@ -7815,7 +7815,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 275, self.input)
@@ -7831,7 +7831,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 276, self.input)
@@ -7847,7 +7847,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 277, self.input)
@@ -7863,7 +7863,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 278, self.input)
@@ -7879,7 +7879,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 279, self.input)
@@ -7888,7 +7888,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 11, self.input)
@@ -7906,7 +7906,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 280, self.input)
@@ -7922,7 +7922,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 281, self.input)
@@ -7938,7 +7938,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 282, self.input)
@@ -7954,7 +7954,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 283, self.input)
@@ -7970,7 +7970,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 284, self.input)
@@ -7986,7 +7986,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 285, self.input)
@@ -8002,7 +8002,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 286, self.input)
@@ -8018,7 +8018,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 287, self.input)
@@ -8034,7 +8034,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 288, self.input)
@@ -8050,7 +8050,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 289, self.input)
@@ -8066,7 +8066,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 290, self.input)
@@ -8082,7 +8082,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 291, self.input)
@@ -8091,7 +8091,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 12, self.input)
@@ -8100,7 +8100,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 0, self.input)
@@ -8112,17 +8112,17 @@ class CParser(Parser):
self.lvalue()
self.following.pop()
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_assignment_operator_in_assignment_expression1746)
self.assignment_operator()
self.following.pop()
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_assignment_expression_in_assignment_expression1748)
self.assignment_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt74 == 2:
@@ -8131,7 +8131,7 @@ class CParser(Parser):
self.conditional_expression()
self.following.pop()
if self.failed:
- return
+ return
@@ -8144,7 +8144,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end assignment_expression
@@ -8157,7 +8157,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 49):
- return
+ return
# C.g:463:2: ( unary_expression )
# C.g:463:4: unary_expression
@@ -8165,7 +8165,7 @@ class CParser(Parser):
self.unary_expression()
self.following.pop()
if self.failed:
- return
+ return
@@ -8179,7 +8179,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end lvalue
@@ -8192,7 +8192,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 50):
- return
+ return
# C.g:467:2: ( '=' | '*=' | '/=' | '%=' | '+=' | '-=' | '<<=' | '>>=' | '&=' | '^=' | '|=' )
# C.g:
@@ -8204,7 +8204,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -8226,7 +8226,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end assignment_operator
@@ -8242,7 +8242,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 51):
- return
+ return
# C.g:481:2: (e= logical_or_expression ( '?' expression ':' conditional_expression )? )
# C.g:481:4: e= logical_or_expression ( '?' expression ':' conditional_expression )?
@@ -8250,7 +8250,7 @@ class CParser(Parser):
e = self.logical_or_expression()
self.following.pop()
if self.failed:
- return
+ return
# C.g:481:28: ( '?' expression ':' conditional_expression )?
alt75 = 2
LA75_0 = self.input.LA(1)
@@ -8261,20 +8261,20 @@ class CParser(Parser):
# C.g:481:29: '?' expression ':' conditional_expression
self.match(self.input, 90, self.FOLLOW_90_in_conditional_expression1842)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_expression_in_conditional_expression1844)
self.expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 47, self.FOLLOW_47_in_conditional_expression1846)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_conditional_expression_in_conditional_expression1848)
self.conditional_expression()
self.following.pop()
if self.failed:
- return
+ return
if self.backtracking == 0:
self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
@@ -8294,7 +8294,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end conditional_expression
@@ -8377,7 +8377,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 53):
- return
+ return
# C.g:489:2: ( inclusive_or_expression ( '&&' inclusive_or_expression )* )
# C.g:489:4: inclusive_or_expression ( '&&' inclusive_or_expression )*
@@ -8385,7 +8385,7 @@ class CParser(Parser):
self.inclusive_or_expression()
self.following.pop()
if self.failed:
- return
+ return
# C.g:489:28: ( '&&' inclusive_or_expression )*
while True: #loop77
alt77 = 2
@@ -8399,12 +8399,12 @@ class CParser(Parser):
# C.g:489:29: '&&' inclusive_or_expression
self.match(self.input, 92, self.FOLLOW_92_in_logical_and_expression1884)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_inclusive_or_expression_in_logical_and_expression1886)
self.inclusive_or_expression()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -8424,7 +8424,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end logical_and_expression
@@ -8437,7 +8437,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 54):
- return
+ return
# C.g:493:2: ( exclusive_or_expression ( '|' exclusive_or_expression )* )
# C.g:493:4: exclusive_or_expression ( '|' exclusive_or_expression )*
@@ -8445,7 +8445,7 @@ class CParser(Parser):
self.exclusive_or_expression()
self.following.pop()
if self.failed:
- return
+ return
# C.g:493:28: ( '|' exclusive_or_expression )*
while True: #loop78
alt78 = 2
@@ -8459,12 +8459,12 @@ class CParser(Parser):
# C.g:493:29: '|' exclusive_or_expression
self.match(self.input, 93, self.FOLLOW_93_in_inclusive_or_expression1902)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_exclusive_or_expression_in_inclusive_or_expression1904)
self.exclusive_or_expression()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -8484,7 +8484,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end inclusive_or_expression
@@ -8497,7 +8497,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 55):
- return
+ return
# C.g:497:2: ( and_expression ( '^' and_expression )* )
# C.g:497:4: and_expression ( '^' and_expression )*
@@ -8505,7 +8505,7 @@ class CParser(Parser):
self.and_expression()
self.following.pop()
if self.failed:
- return
+ return
# C.g:497:19: ( '^' and_expression )*
while True: #loop79
alt79 = 2
@@ -8519,12 +8519,12 @@ class CParser(Parser):
# C.g:497:20: '^' and_expression
self.match(self.input, 94, self.FOLLOW_94_in_exclusive_or_expression1920)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_and_expression_in_exclusive_or_expression1922)
self.and_expression()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -8544,7 +8544,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end exclusive_or_expression
@@ -8557,7 +8557,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 56):
- return
+ return
# C.g:501:2: ( equality_expression ( '&' equality_expression )* )
# C.g:501:4: equality_expression ( '&' equality_expression )*
@@ -8565,7 +8565,7 @@ class CParser(Parser):
self.equality_expression()
self.following.pop()
if self.failed:
- return
+ return
# C.g:501:24: ( '&' equality_expression )*
while True: #loop80
alt80 = 2
@@ -8579,12 +8579,12 @@ class CParser(Parser):
# C.g:501:25: '&' equality_expression
self.match(self.input, 77, self.FOLLOW_77_in_and_expression1938)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_equality_expression_in_and_expression1940)
self.equality_expression()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -8604,7 +8604,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end and_expression
@@ -8617,7 +8617,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 57):
- return
+ return
# C.g:504:2: ( relational_expression ( ( '==' | '!=' ) relational_expression )* )
# C.g:504:4: relational_expression ( ( '==' | '!=' ) relational_expression )*
@@ -8625,7 +8625,7 @@ class CParser(Parser):
self.relational_expression()
self.following.pop()
if self.failed:
- return
+ return
# C.g:504:26: ( ( '==' | '!=' ) relational_expression )*
while True: #loop81
alt81 = 2
@@ -8645,7 +8645,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -8658,7 +8658,7 @@ class CParser(Parser):
self.relational_expression()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -8678,7 +8678,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end equality_expression
@@ -8691,7 +8691,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 58):
- return
+ return
# C.g:508:2: ( shift_expression ( ( '<' | '>' | '<=' | '>=' ) shift_expression )* )
# C.g:508:4: shift_expression ( ( '<' | '>' | '<=' | '>=' ) shift_expression )*
@@ -8699,7 +8699,7 @@ class CParser(Parser):
self.shift_expression()
self.following.pop()
if self.failed:
- return
+ return
# C.g:508:21: ( ( '<' | '>' | '<=' | '>=' ) shift_expression )*
while True: #loop82
alt82 = 2
@@ -8719,7 +8719,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -8732,7 +8732,7 @@ class CParser(Parser):
self.shift_expression()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -8752,7 +8752,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end relational_expression
@@ -8765,7 +8765,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 59):
- return
+ return
# C.g:512:2: ( additive_expression ( ( '<<' | '>>' ) additive_expression )* )
# C.g:512:4: additive_expression ( ( '<<' | '>>' ) additive_expression )*
@@ -8773,7 +8773,7 @@ class CParser(Parser):
self.additive_expression()
self.following.pop()
if self.failed:
- return
+ return
# C.g:512:24: ( ( '<<' | '>>' ) additive_expression )*
while True: #loop83
alt83 = 2
@@ -8793,7 +8793,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -8806,7 +8806,7 @@ class CParser(Parser):
self.additive_expression()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -8826,7 +8826,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end shift_expression
@@ -8839,7 +8839,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 60):
- return
+ return
# C.g:518:2: ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration )
alt84 = 11
@@ -8860,7 +8860,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 43, self.input)
@@ -8880,7 +8880,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 47, self.input)
@@ -8896,7 +8896,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 53, self.input)
@@ -8912,7 +8912,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 68, self.input)
@@ -8923,7 +8923,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 1, self.input)
@@ -8952,7 +8952,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 0, self.input)
@@ -8964,7 +8964,7 @@ class CParser(Parser):
self.labeled_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 2:
@@ -8973,7 +8973,7 @@ class CParser(Parser):
self.compound_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 3:
@@ -8982,7 +8982,7 @@ class CParser(Parser):
self.expression_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 4:
@@ -8991,7 +8991,7 @@ class CParser(Parser):
self.selection_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 5:
@@ -9000,7 +9000,7 @@ class CParser(Parser):
self.iteration_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 6:
@@ -9009,7 +9009,7 @@ class CParser(Parser):
self.jump_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 7:
@@ -9018,7 +9018,7 @@ class CParser(Parser):
self.macro_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 8:
@@ -9027,7 +9027,7 @@ class CParser(Parser):
self.asm2_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 9:
@@ -9036,7 +9036,7 @@ class CParser(Parser):
self.asm1_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 10:
@@ -9045,7 +9045,7 @@ class CParser(Parser):
self.asm_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 11:
@@ -9054,7 +9054,7 @@ class CParser(Parser):
self.declaration()
self.following.pop()
if self.failed:
- return
+ return
@@ -9067,7 +9067,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end statement
@@ -9080,7 +9080,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 61):
- return
+ return
# C.g:532:2: ( ( '__asm__' )? IDENTIFIER '(' (~ ( ';' ) )* ')' ';' )
# C.g:532:4: ( '__asm__' )? IDENTIFIER '(' (~ ( ';' ) )* ')' ';'
@@ -9094,16 +9094,16 @@ class CParser(Parser):
# C.g:0:0: '__asm__'
self.match(self.input, 103, self.FOLLOW_103_in_asm2_statement2086)
if self.failed:
- return
+ return
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_asm2_statement2089)
if self.failed:
- return
+ return
self.match(self.input, 62, self.FOLLOW_62_in_asm2_statement2091)
if self.failed:
- return
+ return
# C.g:532:30: (~ ( ';' ) )*
while True: #loop86
alt86 = 2
@@ -9130,7 +9130,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -9147,10 +9147,10 @@ class CParser(Parser):
self.match(self.input, 63, self.FOLLOW_63_in_asm2_statement2101)
if self.failed:
- return
+ return
self.match(self.input, 25, self.FOLLOW_25_in_asm2_statement2103)
if self.failed:
- return
+ return
@@ -9164,7 +9164,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end asm2_statement
@@ -9177,16 +9177,16 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 62):
- return
+ return
# C.g:536:2: ( '_asm' '{' (~ ( '}' ) )* '}' )
# C.g:536:4: '_asm' '{' (~ ( '}' ) )* '}'
self.match(self.input, 104, self.FOLLOW_104_in_asm1_statement2115)
if self.failed:
- return
+ return
self.match(self.input, 43, self.FOLLOW_43_in_asm1_statement2117)
if self.failed:
- return
+ return
# C.g:536:15: (~ ( '}' ) )*
while True: #loop87
alt87 = 2
@@ -9206,7 +9206,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -9223,7 +9223,7 @@ class CParser(Parser):
self.match(self.input, 44, self.FOLLOW_44_in_asm1_statement2127)
if self.failed:
- return
+ return
@@ -9237,7 +9237,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end asm1_statement
@@ -9250,16 +9250,16 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 63):
- return
+ return
# C.g:540:2: ( '__asm' '{' (~ ( '}' ) )* '}' )
# C.g:540:4: '__asm' '{' (~ ( '}' ) )* '}'
self.match(self.input, 105, self.FOLLOW_105_in_asm_statement2138)
if self.failed:
- return
+ return
self.match(self.input, 43, self.FOLLOW_43_in_asm_statement2140)
if self.failed:
- return
+ return
# C.g:540:16: (~ ( '}' ) )*
while True: #loop88
alt88 = 2
@@ -9279,7 +9279,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -9296,7 +9296,7 @@ class CParser(Parser):
self.match(self.input, 44, self.FOLLOW_44_in_asm_statement2150)
if self.failed:
- return
+ return
@@ -9310,7 +9310,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end asm_statement
@@ -9323,16 +9323,16 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 64):
- return
+ return
# C.g:544:2: ( IDENTIFIER '(' ( declaration )* ( statement_list )? ( expression )? ')' )
# C.g:544:4: IDENTIFIER '(' ( declaration )* ( statement_list )? ( expression )? ')'
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_macro_statement2162)
if self.failed:
- return
+ return
self.match(self.input, 62, self.FOLLOW_62_in_macro_statement2164)
if self.failed:
- return
+ return
# C.g:544:19: ( declaration )*
while True: #loop89
alt89 = 2
@@ -11234,7 +11234,7 @@ class CParser(Parser):
self.declaration()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -12440,7 +12440,7 @@ class CParser(Parser):
self.statement_list()
self.following.pop()
if self.failed:
- return
+ return
@@ -12456,13 +12456,13 @@ class CParser(Parser):
self.expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_macro_statement2176)
if self.failed:
- return
+ return
@@ -12476,7 +12476,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end macro_statement
@@ -12489,7 +12489,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 65):
- return
+ return
# C.g:548:2: ( IDENTIFIER ':' statement | 'case' constant_expression ':' statement | 'default' ':' statement )
alt92 = 3
@@ -12503,7 +12503,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("547:1: labeled_statement : ( IDENTIFIER ':' statement | 'case' constant_expression ':' statement | 'default' ':' statement );", 92, 0, self.input)
@@ -12513,50 +12513,50 @@ class CParser(Parser):
# C.g:548:4: IDENTIFIER ':' statement
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_labeled_statement2188)
if self.failed:
- return
+ return
self.match(self.input, 47, self.FOLLOW_47_in_labeled_statement2190)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_statement_in_labeled_statement2192)
self.statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt92 == 2:
# C.g:549:4: 'case' constant_expression ':' statement
self.match(self.input, 106, self.FOLLOW_106_in_labeled_statement2197)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_constant_expression_in_labeled_statement2199)
self.constant_expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 47, self.FOLLOW_47_in_labeled_statement2201)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_statement_in_labeled_statement2203)
self.statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt92 == 3:
# C.g:550:4: 'default' ':' statement
self.match(self.input, 107, self.FOLLOW_107_in_labeled_statement2208)
if self.failed:
- return
+ return
self.match(self.input, 47, self.FOLLOW_47_in_labeled_statement2210)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_statement_in_labeled_statement2212)
self.statement()
self.following.pop()
if self.failed:
- return
+ return
@@ -12569,7 +12569,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end labeled_statement
@@ -14552,7 +14552,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 67):
- return
+ return
# C.g:558:2: ( ( statement )+ )
# C.g:558:4: ( statement )+
@@ -16230,7 +16230,7 @@ class CParser(Parser):
self.statement()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -16239,7 +16239,7 @@ class CParser(Parser):
if self.backtracking > 0:
self.failed = True
- return
+ return
eee = EarlyExitException(95, self.input)
raise eee
@@ -16260,7 +16260,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end statement_list
@@ -16347,7 +16347,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 69):
- return
+ return
# C.g:567:2: ( 'if' '(' e= expression ')' statement ( options {k=1; backtrack=false; } : 'else' statement )? | 'switch' '(' expression ')' statement )
alt98 = 2
@@ -16360,7 +16360,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("566:1: selection_statement : ( 'if' '(' e= expression ')' statement ( options {k=1; backtrack=false; } : 'else' statement )? | 'switch' '(' expression ')' statement );", 98, 0, self.input)
@@ -16370,18 +16370,18 @@ class CParser(Parser):
# C.g:567:4: 'if' '(' e= expression ')' statement ( options {k=1; backtrack=false; } : 'else' statement )?
self.match(self.input, 108, self.FOLLOW_108_in_selection_statement2272)
if self.failed:
- return
+ return
self.match(self.input, 62, self.FOLLOW_62_in_selection_statement2274)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_expression_in_selection_statement2278)
e = self.expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_selection_statement2280)
if self.failed:
- return
+ return
if self.backtracking == 0:
self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
@@ -16389,7 +16389,7 @@ class CParser(Parser):
self.statement()
self.following.pop()
if self.failed:
- return
+ return
# C.g:567:167: ( options {k=1; backtrack=false; } : 'else' statement )?
alt97 = 2
LA97_0 = self.input.LA(1)
@@ -16400,12 +16400,12 @@ class CParser(Parser):
# C.g:567:200: 'else' statement
self.match(self.input, 109, self.FOLLOW_109_in_selection_statement2299)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_statement_in_selection_statement2301)
self.statement()
self.following.pop()
if self.failed:
- return
+ return
@@ -16415,23 +16415,23 @@ class CParser(Parser):
# C.g:568:4: 'switch' '(' expression ')' statement
self.match(self.input, 110, self.FOLLOW_110_in_selection_statement2308)
if self.failed:
- return
+ return
self.match(self.input, 62, self.FOLLOW_62_in_selection_statement2310)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_expression_in_selection_statement2312)
self.expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_selection_statement2314)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_statement_in_selection_statement2316)
self.statement()
self.following.pop()
if self.failed:
- return
+ return
@@ -16444,7 +16444,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end selection_statement
@@ -16460,7 +16460,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 70):
- return
+ return
# C.g:572:2: ( 'while' '(' e= expression ')' statement | 'do' statement 'while' '(' e= expression ')' ';' | 'for' '(' expression_statement e= expression_statement ( expression )? ')' statement )
alt100 = 3
@@ -16474,7 +16474,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("571:1: iteration_statement : ( 'while' '(' e= expression ')' statement | 'do' statement 'while' '(' e= expression ')' ';' | 'for' '(' expression_statement e= expression_statement ( expression )? ')' statement );", 100, 0, self.input)
@@ -16484,23 +16484,23 @@ class CParser(Parser):
# C.g:572:4: 'while' '(' e= expression ')' statement
self.match(self.input, 111, self.FOLLOW_111_in_iteration_statement2327)
if self.failed:
- return
+ return
self.match(self.input, 62, self.FOLLOW_62_in_iteration_statement2329)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_expression_in_iteration_statement2333)
e = self.expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_iteration_statement2335)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_statement_in_iteration_statement2337)
self.statement()
self.following.pop()
if self.failed:
- return
+ return
if self.backtracking == 0:
self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
@@ -16510,29 +16510,29 @@ class CParser(Parser):
# C.g:573:4: 'do' statement 'while' '(' e= expression ')' ';'
self.match(self.input, 112, self.FOLLOW_112_in_iteration_statement2344)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_statement_in_iteration_statement2346)
self.statement()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 111, self.FOLLOW_111_in_iteration_statement2348)
if self.failed:
- return
+ return
self.match(self.input, 62, self.FOLLOW_62_in_iteration_statement2350)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_expression_in_iteration_statement2354)
e = self.expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_iteration_statement2356)
if self.failed:
- return
+ return
self.match(self.input, 25, self.FOLLOW_25_in_iteration_statement2358)
if self.failed:
- return
+ return
if self.backtracking == 0:
self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
@@ -16542,20 +16542,20 @@ class CParser(Parser):
# C.g:574:4: 'for' '(' expression_statement e= expression_statement ( expression )? ')' statement
self.match(self.input, 113, self.FOLLOW_113_in_iteration_statement2365)
if self.failed:
- return
+ return
self.match(self.input, 62, self.FOLLOW_62_in_iteration_statement2367)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_expression_statement_in_iteration_statement2369)
self.expression_statement()
self.following.pop()
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_expression_statement_in_iteration_statement2373)
e = self.expression_statement()
self.following.pop()
if self.failed:
- return
+ return
# C.g:574:58: ( expression )?
alt99 = 2
LA99_0 = self.input.LA(1)
@@ -16568,18 +16568,18 @@ class CParser(Parser):
self.expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_iteration_statement2378)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_statement_in_iteration_statement2380)
self.statement()
self.following.pop()
if self.failed:
- return
+ return
if self.backtracking == 0:
self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
@@ -16595,7 +16595,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end iteration_statement
@@ -16608,7 +16608,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 71):
- return
+ return
# C.g:578:2: ( 'goto' IDENTIFIER ';' | 'continue' ';' | 'break' ';' | 'return' ';' | 'return' expression ';' )
alt101 = 5
@@ -16629,7 +16629,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("577:1: jump_statement : ( 'goto' IDENTIFIER ';' | 'continue' ';' | 'break' ';' | 'return' ';' | 'return' expression ';' );", 101, 4, self.input)
@@ -16638,7 +16638,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("577:1: jump_statement : ( 'goto' IDENTIFIER ';' | 'continue' ';' | 'break' ';' | 'return' ';' | 'return' expression ';' );", 101, 0, self.input)
@@ -16648,58 +16648,58 @@ class CParser(Parser):
# C.g:578:4: 'goto' IDENTIFIER ';'
self.match(self.input, 114, self.FOLLOW_114_in_jump_statement2393)
if self.failed:
- return
+ return
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_jump_statement2395)
if self.failed:
- return
+ return
self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2397)
if self.failed:
- return
+ return
elif alt101 == 2:
# C.g:579:4: 'continue' ';'
self.match(self.input, 115, self.FOLLOW_115_in_jump_statement2402)
if self.failed:
- return
+ return
self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2404)
if self.failed:
- return
+ return
elif alt101 == 3:
# C.g:580:4: 'break' ';'
self.match(self.input, 116, self.FOLLOW_116_in_jump_statement2409)
if self.failed:
- return
+ return
self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2411)
if self.failed:
- return
+ return
elif alt101 == 4:
# C.g:581:4: 'return' ';'
self.match(self.input, 117, self.FOLLOW_117_in_jump_statement2416)
if self.failed:
- return
+ return
self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2418)
if self.failed:
- return
+ return
elif alt101 == 5:
# C.g:582:4: 'return' expression ';'
self.match(self.input, 117, self.FOLLOW_117_in_jump_statement2423)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_expression_in_jump_statement2425)
self.expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2427)
if self.failed:
- return
+ return
@@ -16712,7 +16712,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end jump_statement
@@ -16724,7 +16724,7 @@ class CParser(Parser):
self.declaration_specifiers()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred2
@@ -16855,7 +16855,7 @@ class CParser(Parser):
self.declaration_specifiers()
self.following.pop()
if self.failed:
- return
+ return
@@ -16863,7 +16863,7 @@ class CParser(Parser):
self.declarator()
self.following.pop()
if self.failed:
- return
+ return
# C.g:119:41: ( declaration )*
while True: #loop103
alt103 = 2
@@ -16879,7 +16879,7 @@ class CParser(Parser):
self.declaration()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -16888,7 +16888,7 @@ class CParser(Parser):
self.match(self.input, 43, self.FOLLOW_43_in_synpred4108)
if self.failed:
- return
+ return
# $ANTLR end synpred4
@@ -16903,7 +16903,7 @@ class CParser(Parser):
self.declaration()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred5
@@ -16918,7 +16918,7 @@ class CParser(Parser):
self.declaration_specifiers()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred7
@@ -16933,7 +16933,7 @@ class CParser(Parser):
self.declaration_specifiers()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred10
@@ -16948,7 +16948,7 @@ class CParser(Parser):
self.type_specifier()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred14
@@ -16963,7 +16963,7 @@ class CParser(Parser):
self.type_qualifier()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred15
@@ -16978,7 +16978,7 @@ class CParser(Parser):
self.type_qualifier()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred33
@@ -16991,7 +16991,7 @@ class CParser(Parser):
# C.g:225:5: IDENTIFIER ( type_qualifier )* declarator
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_synpred34442)
if self.failed:
- return
+ return
# C.g:225:16: ( type_qualifier )*
while True: #loop106
alt106 = 2
@@ -17026,7 +17026,7 @@ class CParser(Parser):
self.type_qualifier()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -17037,7 +17037,7 @@ class CParser(Parser):
self.declarator()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred34
@@ -17052,7 +17052,7 @@ class CParser(Parser):
self.type_qualifier()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred39
@@ -17067,7 +17067,7 @@ class CParser(Parser):
self.type_specifier()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred40
@@ -17090,7 +17090,7 @@ class CParser(Parser):
self.pointer()
self.following.pop()
if self.failed:
- return
+ return
@@ -17104,7 +17104,7 @@ class CParser(Parser):
# C.g:297:14: 'EFIAPI'
self.match(self.input, 58, self.FOLLOW_58_in_synpred66788)
if self.failed:
- return
+ return
@@ -17118,7 +17118,7 @@ class CParser(Parser):
# C.g:297:26: 'EFI_BOOTSERVICE'
self.match(self.input, 59, self.FOLLOW_59_in_synpred66793)
if self.failed:
- return
+ return
@@ -17132,7 +17132,7 @@ class CParser(Parser):
# C.g:297:47: 'EFI_RUNTIMESERVICE'
self.match(self.input, 60, self.FOLLOW_60_in_synpred66798)
if self.failed:
- return
+ return
@@ -17140,7 +17140,7 @@ class CParser(Parser):
self.direct_declarator()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred66
@@ -17155,7 +17155,7 @@ class CParser(Parser):
self.declarator_suffix()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred67
@@ -17168,7 +17168,7 @@ class CParser(Parser):
# C.g:304:9: 'EFIAPI'
self.match(self.input, 58, self.FOLLOW_58_in_synpred69830)
if self.failed:
- return
+ return
# $ANTLR end synpred69
@@ -17183,7 +17183,7 @@ class CParser(Parser):
self.declarator_suffix()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred70
@@ -17196,15 +17196,15 @@ class CParser(Parser):
# C.g:310:9: '(' parameter_type_list ')'
self.match(self.input, 62, self.FOLLOW_62_in_synpred73878)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_parameter_type_list_in_synpred73880)
self.parameter_type_list()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_synpred73882)
if self.failed:
- return
+ return
# $ANTLR end synpred73
@@ -17217,15 +17217,15 @@ class CParser(Parser):
# C.g:311:9: '(' identifier_list ')'
self.match(self.input, 62, self.FOLLOW_62_in_synpred74892)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_identifier_list_in_synpred74894)
self.identifier_list()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_synpred74896)
if self.failed:
- return
+ return
# $ANTLR end synpred74
@@ -17240,7 +17240,7 @@ class CParser(Parser):
self.type_qualifier()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred75
@@ -17255,7 +17255,7 @@ class CParser(Parser):
self.pointer()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred76
@@ -17268,7 +17268,7 @@ class CParser(Parser):
# C.g:316:4: '*' ( type_qualifier )+ ( pointer )?
self.match(self.input, 66, self.FOLLOW_66_in_synpred77919)
if self.failed:
- return
+ return
# C.g:316:8: ( type_qualifier )+
cnt116 = 0
while True: #loop116
@@ -17285,7 +17285,7 @@ class CParser(Parser):
self.type_qualifier()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -17294,7 +17294,7 @@ class CParser(Parser):
if self.backtracking > 0:
self.failed = True
- return
+ return
eee = EarlyExitException(116, self.input)
raise eee
@@ -17314,7 +17314,7 @@ class CParser(Parser):
self.pointer()
self.following.pop()
if self.failed:
- return
+ return
@@ -17330,12 +17330,12 @@ class CParser(Parser):
# C.g:317:4: '*' pointer
self.match(self.input, 66, self.FOLLOW_66_in_synpred78930)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_pointer_in_synpred78932)
self.pointer()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred78
@@ -17348,7 +17348,7 @@ class CParser(Parser):
# C.g:326:32: 'OPTIONAL'
self.match(self.input, 53, self.FOLLOW_53_in_synpred81977)
if self.failed:
- return
+ return
# $ANTLR end synpred81
@@ -17361,7 +17361,7 @@ class CParser(Parser):
# C.g:326:27: ',' ( 'OPTIONAL' )? parameter_declaration
self.match(self.input, 27, self.FOLLOW_27_in_synpred82974)
if self.failed:
- return
+ return
# C.g:326:31: ( 'OPTIONAL' )?
alt119 = 2
LA119_0 = self.input.LA(1)
@@ -17375,7 +17375,7 @@ class CParser(Parser):
# C.g:326:32: 'OPTIONAL'
self.match(self.input, 53, self.FOLLOW_53_in_synpred82977)
if self.failed:
- return
+ return
@@ -17383,7 +17383,7 @@ class CParser(Parser):
self.parameter_declaration()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred82
@@ -17398,7 +17398,7 @@ class CParser(Parser):
self.declarator()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred83
@@ -17413,7 +17413,7 @@ class CParser(Parser):
self.abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred84
@@ -17428,7 +17428,7 @@ class CParser(Parser):
self.declaration_specifiers()
self.following.pop()
if self.failed:
- return
+ return
# C.g:330:27: ( declarator | abstract_declarator )*
while True: #loop120
alt120 = 3
@@ -17512,7 +17512,7 @@ class CParser(Parser):
self.declarator()
self.following.pop()
if self.failed:
- return
+ return
elif alt120 == 2:
@@ -17521,7 +17521,7 @@ class CParser(Parser):
self.abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -17538,7 +17538,7 @@ class CParser(Parser):
# C.g:330:62: 'OPTIONAL'
self.match(self.input, 53, self.FOLLOW_53_in_synpred861004)
if self.failed:
- return
+ return
@@ -17556,7 +17556,7 @@ class CParser(Parser):
self.specifier_qualifier_list()
self.following.pop()
if self.failed:
- return
+ return
# C.g:341:29: ( abstract_declarator )?
alt122 = 2
LA122_0 = self.input.LA(1)
@@ -17569,7 +17569,7 @@ class CParser(Parser):
self.abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
@@ -17587,7 +17587,7 @@ class CParser(Parser):
self.direct_abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred91
@@ -17600,15 +17600,15 @@ class CParser(Parser):
# C.g:351:6: '(' abstract_declarator ')'
self.match(self.input, 62, self.FOLLOW_62_in_synpred931086)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_abstract_declarator_in_synpred931088)
self.abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_synpred931090)
if self.failed:
- return
+ return
# $ANTLR end synpred93
@@ -17623,7 +17623,7 @@ class CParser(Parser):
self.abstract_declarator_suffix()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred94
@@ -17636,20 +17636,20 @@ class CParser(Parser):
# C.g:386:4: '(' type_name ')' cast_expression
self.match(self.input, 62, self.FOLLOW_62_in_synpred1091282)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_type_name_in_synpred1091284)
self.type_name()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_synpred1091286)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_cast_expression_in_synpred1091288)
self.cast_expression()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred109
@@ -17662,12 +17662,12 @@ class CParser(Parser):
# C.g:395:4: 'sizeof' unary_expression
self.match(self.input, 74, self.FOLLOW_74_in_synpred1141330)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_unary_expression_in_synpred1141332)
self.unary_expression()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred114
@@ -17680,15 +17680,15 @@ class CParser(Parser):
# C.g:409:13: '(' argument_expression_list ')'
self.match(self.input, 62, self.FOLLOW_62_in_synpred1171420)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_argument_expression_list_in_synpred1171424)
self.argument_expression_list()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_synpred1171428)
if self.failed:
- return
+ return
# $ANTLR end synpred117
@@ -17701,15 +17701,15 @@ class CParser(Parser):
# C.g:410:13: '(' macro_parameter_list ')'
self.match(self.input, 62, self.FOLLOW_62_in_synpred1181444)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_macro_parameter_list_in_synpred1181446)
self.macro_parameter_list()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_synpred1181448)
if self.failed:
- return
+ return
# $ANTLR end synpred118
@@ -17722,10 +17722,10 @@ class CParser(Parser):
# C.g:412:13: '*' IDENTIFIER
self.match(self.input, 66, self.FOLLOW_66_in_synpred1201482)
if self.failed:
- return
+ return
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_synpred1201486)
if self.failed:
- return
+ return
# $ANTLR end synpred120
@@ -17738,7 +17738,7 @@ class CParser(Parser):
# C.g:443:20: STRING_LITERAL
self.match(self.input, STRING_LITERAL, self.FOLLOW_STRING_LITERAL_in_synpred1371683)
if self.failed:
- return
+ return
# $ANTLR end synpred137
@@ -17762,7 +17762,7 @@ class CParser(Parser):
# C.g:0:0: IDENTIFIER
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_synpred1381680)
if self.failed:
- return
+ return
else:
@@ -17783,7 +17783,7 @@ class CParser(Parser):
# C.g:0:0: STRING_LITERAL
self.match(self.input, STRING_LITERAL, self.FOLLOW_STRING_LITERAL_in_synpred1381683)
if self.failed:
- return
+ return
else:
@@ -17792,7 +17792,7 @@ class CParser(Parser):
if self.backtracking > 0:
self.failed = True
- return
+ return
eee = EarlyExitException(126, self.input)
raise eee
@@ -17814,17 +17814,17 @@ class CParser(Parser):
self.lvalue()
self.following.pop()
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_assignment_operator_in_synpred1421746)
self.assignment_operator()
self.following.pop()
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_assignment_expression_in_synpred1421748)
self.assignment_expression()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred142
@@ -17839,7 +17839,7 @@ class CParser(Parser):
self.expression_statement()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred169
@@ -17854,7 +17854,7 @@ class CParser(Parser):
self.macro_statement()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred173
@@ -17869,7 +17869,7 @@ class CParser(Parser):
self.asm2_statement()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred174
@@ -17884,7 +17884,7 @@ class CParser(Parser):
self.declaration()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred181
@@ -17899,7 +17899,7 @@ class CParser(Parser):
self.statement_list()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred182
@@ -17914,7 +17914,7 @@ class CParser(Parser):
self.declaration()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred186
@@ -17929,7 +17929,7 @@ class CParser(Parser):
self.statement()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred188
@@ -18388,7 +18388,7 @@ class CParser(Parser):
-
+
FOLLOW_external_declaration_in_translation_unit74 = frozenset([1, 4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66])
FOLLOW_function_definition_in_external_declaration113 = frozenset([1])
diff --git a/BaseTools/Source/Python/Ecc/Check.py b/BaseTools/Source/Python/Ecc/Check.py
index 27783e617b92..0a27081df37d 100644
--- a/BaseTools/Source/Python/Ecc/Check.py
+++ b/BaseTools/Source/Python/Ecc/Check.py
@@ -563,17 +563,17 @@ class Check(object):
op = open(FullName).readlines()
FileLinesList = op
LineNo = 0
- CurrentSection = MODEL_UNKNOWN
+ CurrentSection = MODEL_UNKNOWN
HeaderSectionLines = []
- HeaderCommentStart = False
+ HeaderCommentStart = False
HeaderCommentEnd = False
-
+
for Line in FileLinesList:
LineNo = LineNo + 1
Line = Line.strip()
if (LineNo < len(FileLinesList) - 1):
NextLine = FileLinesList[LineNo].strip()
-
+
#
# blank line
#
@@ -600,8 +600,8 @@ class Check(object):
#
HeaderSectionLines.append((Line, LineNo))
HeaderCommentStart = True
- continue
-
+ continue
+
#
# Collect Header content.
#
@@ -635,7 +635,7 @@ class Check(object):
if EccGlobalData.gConfig.HeaderCheckFileCommentEnd == '1' or EccGlobalData.gConfig.HeaderCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
EccGlobalData.gDb.TblReport.Insert(ERROR_DOXYGEN_CHECK_FILE_HEADER, Msg, "File", Result[0])
-
+
# Check whether the function headers are followed Doxygen special documentation blocks in section 2.3.5
def DoxygenCheckFunctionHeader(self):
@@ -827,7 +827,7 @@ class Check(object):
for FilePath in FilePathList:
if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_LIBRARY_NAME_DUPLICATE, Record[1]):
EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_LIBRARY_NAME_DUPLICATE, OtherMsg="The Library Class [%s] is duplicated in '%s' line %s and line %s." % (Record[1], FilePath, Record[3], Record[4]), BelongsToTable='Dsc', BelongsToItem=Record[0])
-
+
# Check the header file in Include\Library directory whether be defined in the package DEC file.
def MetaDataFileCheckLibraryDefinedInDec(self):
if EccGlobalData.gConfig.MetaDataFileCheckLibraryDefinedInDec == '1' or EccGlobalData.gConfig.MetaDataFileCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
@@ -842,9 +842,9 @@ class Check(object):
if not LibraryDec:
if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_LIBRARY_NOT_DEFINED, LibraryInInf):
EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_LIBRARY_NOT_DEFINED, \
- OtherMsg="The Library Class [%s] in %s line is not defined in the associated package file." % (LibraryInInf, Line),
+ OtherMsg="The Library Class [%s] in %s line is not defined in the associated package file." % (LibraryInInf, Line),
BelongsToTable='Inf', BelongsToItem=ID)
-
+
# Check whether an Inf file is specified in the FDF file, but not in the Dsc file, then the Inf file must be for a Binary module only
def MetaDataFileCheckBinaryInfInFdf(self):
if EccGlobalData.gConfig.MetaDataFileCheckBinaryInfInFdf == '1' or EccGlobalData.gConfig.MetaDataFileCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
@@ -1244,7 +1244,7 @@ class Check(object):
group by A.ID
""" % (Table.Table, Table.Table, Model, Model)
RecordSet = Table.Exec(SqlCommand)
- for Record in RecordSet:
+ for Record in RecordSet:
if not EccGlobalData.gException.IsException(ErrorID, Record[2]):
EccGlobalData.gDb.TblReport.Insert(ErrorID, OtherMsg="The %s value [%s] is used more than one time" % (Name.upper(), Record[2]), BelongsToTable=Table.Table, BelongsToItem=Record[0])
diff --git a/BaseTools/Source/Python/Ecc/CodeFragment.py b/BaseTools/Source/Python/Ecc/CodeFragment.py
index 3bf1c4515020..beb29a8203b4 100644
--- a/BaseTools/Source/Python/Ecc/CodeFragment.py
+++ b/BaseTools/Source/Python/Ecc/CodeFragment.py
@@ -161,5 +161,4 @@ class FunctionCalling:
self.FuncName = Name
self.ParamList = Param
self.StartPos = Begin
- self.EndPos = End
-
\ No newline at end of file
+ self.EndPos = End
diff --git a/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py b/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
index ffa51de7c1bf..3377f4a94003 100644
--- a/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
+++ b/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
@@ -46,7 +46,7 @@ from ParserWarning import Warning
T_CHAR_BACKSLASH, T_CHAR_DOUBLE_QUOTE, T_CHAR_SINGLE_QUOTE, T_CHAR_STAR, T_CHAR_HASH) = \
(' ', '\0', '\r', '\t', '\n', '/', '\\', '\"', '\'', '*', '#')
-SEPERATOR_TUPLE = ('=', '|', ',', '{', '}')
+SEPERATOR_TUPLE = ('=', '|', ',', '{', '}')
(T_COMMENT_TWO_SLASH, T_COMMENT_SLASH_STAR) = (0, 1)
@@ -58,7 +58,7 @@ SEPERATOR_TUPLE = ('=', '|', ',', '{', '}')
#
# GetNext*** procedures mean these procedures will get next token first, then make judgement.
# Get*** procedures mean these procedures will make judgement on current token only.
-#
+#
class CodeFragmentCollector:
## The constructor
#
@@ -88,7 +88,7 @@ class CodeFragmentCollector:
SizeOfLastLine = NumberOfLines
if NumberOfLines > 0:
SizeOfLastLine = len(self.Profile.FileLinesList[-1])
-
+
if self.CurrentLineNumber == NumberOfLines and self.CurrentOffsetWithinLine >= SizeOfLastLine - 1:
return True
elif self.CurrentLineNumber > NumberOfLines:
@@ -110,7 +110,7 @@ class CodeFragmentCollector:
return True
else:
return False
-
+
## Rewind() method
#
# Reset file data buffer to the initial state
@@ -120,7 +120,7 @@ class CodeFragmentCollector:
def Rewind(self):
self.CurrentLineNumber = 1
self.CurrentOffsetWithinLine = 0
-
+
## __UndoOneChar() method
#
# Go back one char in the file buffer
@@ -128,9 +128,9 @@ class CodeFragmentCollector:
# @param self The object pointer
# @retval True Successfully go back one char
# @retval False Not able to go back one char as file beginning reached
- #
+ #
def __UndoOneChar(self):
-
+
if self.CurrentLineNumber == 1 and self.CurrentOffsetWithinLine == 0:
return False
elif self.CurrentOffsetWithinLine == 0:
@@ -139,13 +139,13 @@ class CodeFragmentCollector:
else:
self.CurrentOffsetWithinLine -= 1
return True
-
+
## __GetOneChar() method
#
# Move forward one char in the file buffer
#
# @param self The object pointer
- #
+ #
def __GetOneChar(self):
if self.CurrentOffsetWithinLine == len(self.Profile.FileLinesList[self.CurrentLineNumber - 1]) - 1:
self.CurrentLineNumber += 1
@@ -159,13 +159,13 @@ class CodeFragmentCollector:
#
# @param self The object pointer
# @retval Char Current char
- #
+ #
def __CurrentChar(self):
CurrentChar = self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine]
# if CurrentChar > 255:
# raise Warning("Non-Ascii char found At Line %d, offset %d" % (self.CurrentLineNumber, self.CurrentOffsetWithinLine), self.FileName, self.CurrentLineNumber)
return CurrentChar
-
+
## __NextChar() method
#
# Get the one char pass the char pointed to by the file buffer pointer
@@ -178,7 +178,7 @@ class CodeFragmentCollector:
return self.Profile.FileLinesList[self.CurrentLineNumber][0]
else:
return self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine + 1]
-
+
## __SetCurrentCharValue() method
#
# Modify the value of current char
@@ -188,7 +188,7 @@ class CodeFragmentCollector:
#
def __SetCurrentCharValue(self, Value):
self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine] = Value
-
+
## __SetCharValue() method
#
# Modify the value of current char
@@ -198,7 +198,7 @@ class CodeFragmentCollector:
#
def __SetCharValue(self, Line, Offset, Value):
self.Profile.FileLinesList[Line - 1][Offset] = Value
-
+
## __CurrentLine() method
#
# Get the list that contains current line contents
@@ -208,7 +208,7 @@ class CodeFragmentCollector:
#
def __CurrentLine(self):
return self.Profile.FileLinesList[self.CurrentLineNumber - 1]
-
+
## __InsertComma() method
#
# Insert ',' to replace PP
@@ -217,24 +217,24 @@ class CodeFragmentCollector:
# @retval List current line contents
#
def __InsertComma(self, Line):
-
-
+
+
if self.Profile.FileLinesList[Line - 1][0] != T_CHAR_HASH:
BeforeHashPart = str(self.Profile.FileLinesList[Line - 1]).split(T_CHAR_HASH)[0]
if BeforeHashPart.rstrip().endswith(T_CHAR_COMMA) or BeforeHashPart.rstrip().endswith(';'):
return
-
+
if Line - 2 >= 0 and str(self.Profile.FileLinesList[Line - 2]).rstrip().endswith(','):
return
-
+
if Line - 2 >= 0 and str(self.Profile.FileLinesList[Line - 2]).rstrip().endswith(';'):
return
-
+
if str(self.Profile.FileLinesList[Line]).lstrip().startswith(',') or str(self.Profile.FileLinesList[Line]).lstrip().startswith(';'):
return
-
+
self.Profile.FileLinesList[Line - 1].insert(self.CurrentOffsetWithinLine, ',')
-
+
## PreprocessFile() method
#
# Preprocess file contents, replace comments with spaces.
@@ -243,7 +243,7 @@ class CodeFragmentCollector:
# !include statement should be expanded at the same FileLinesList[CurrentLineNumber - 1]
#
# @param self The object pointer
- #
+ #
def PreprocessFile(self):
self.Rewind()
@@ -255,14 +255,14 @@ class CodeFragmentCollector:
PPDirectiveObj = None
# HashComment in quoted string " " is ignored.
InString = False
- InCharLiteral = False
-
+ InCharLiteral = False
+
self.Profile.FileLinesList = [list(s) for s in self.Profile.FileLinesListFromFile]
while not self.__EndOfFile():
-
+
if not InComment and self.__CurrentChar() == T_CHAR_DOUBLE_QUOTE:
InString = not InString
-
+
if not InComment and self.__CurrentChar() == T_CHAR_SINGLE_QUOTE:
InCharLiteral = not InCharLiteral
# meet new line, then no longer in a comment for // and '#'
@@ -273,9 +273,9 @@ class CodeFragmentCollector:
PPExtend = True
else:
PPExtend = False
-
+
EndLinePos = (self.CurrentLineNumber, self.CurrentOffsetWithinLine)
-
+
if InComment and DoubleSlashComment:
InComment = False
DoubleSlashComment = False
@@ -290,17 +290,17 @@ class CodeFragmentCollector:
PPDirectiveObj.EndPos = EndLinePos
FileProfile.PPDirectiveList.append(PPDirectiveObj)
PPDirectiveObj = None
-
+
if InString or InCharLiteral:
CurrentLine = "".join(self.__CurrentLine())
if CurrentLine.rstrip(T_CHAR_LF).rstrip(T_CHAR_CR).endswith(T_CHAR_BACKSLASH):
SlashIndex = CurrentLine.rindex(T_CHAR_BACKSLASH)
self.__SetCharValue(self.CurrentLineNumber, SlashIndex, T_CHAR_SPACE)
-
+
if InComment and not DoubleSlashComment and not HashComment:
CommentObj.Content += T_CHAR_LF
self.CurrentLineNumber += 1
- self.CurrentOffsetWithinLine = 0
+ self.CurrentOffsetWithinLine = 0
# check for */ comment end
elif InComment and not DoubleSlashComment and not HashComment and self.__CurrentChar() == T_CHAR_STAR and self.__NextChar() == T_CHAR_SLASH:
CommentObj.Content += self.__CurrentChar()
@@ -314,7 +314,7 @@ class CodeFragmentCollector:
self.__GetOneChar()
InComment = False
# set comments to spaces
- elif InComment:
+ elif InComment:
if HashComment:
# // follows hash PP directive
if self.__CurrentChar() == T_CHAR_SLASH and self.__NextChar() == T_CHAR_SLASH:
@@ -340,7 +340,7 @@ class CodeFragmentCollector:
# check for '#' comment
elif self.__CurrentChar() == T_CHAR_HASH and not InString and not InCharLiteral:
InComment = True
- HashComment = True
+ HashComment = True
PPDirectiveObj = PP_Directive('', (self.CurrentLineNumber, self.CurrentOffsetWithinLine), None)
# check for /* comment start
elif self.__CurrentChar() == T_CHAR_SLASH and self.__NextChar() == T_CHAR_STAR:
@@ -354,9 +354,9 @@ class CodeFragmentCollector:
InComment = True
else:
self.__GetOneChar()
-
+
EndLinePos = (self.CurrentLineNumber, self.CurrentOffsetWithinLine)
-
+
if InComment and DoubleSlashComment:
CommentObj.EndPos = EndLinePos
FileProfile.CommentList.append(CommentObj)
@@ -377,14 +377,14 @@ class CodeFragmentCollector:
PPDirectiveObj = None
# HashComment in quoted string " " is ignored.
InString = False
- InCharLiteral = False
+ InCharLiteral = False
self.Profile.FileLinesList = [list(s) for s in self.Profile.FileLinesListFromFile]
while not self.__EndOfFile():
-
+
if not InComment and self.__CurrentChar() == T_CHAR_DOUBLE_QUOTE:
InString = not InString
-
+
if not InComment and self.__CurrentChar() == T_CHAR_SINGLE_QUOTE:
InCharLiteral = not InCharLiteral
# meet new line, then no longer in a comment for // and '#'
@@ -395,9 +395,9 @@ class CodeFragmentCollector:
PPExtend = True
else:
PPExtend = False
-
+
EndLinePos = (self.CurrentLineNumber, self.CurrentOffsetWithinLine)
-
+
if InComment and DoubleSlashComment:
InComment = False
DoubleSlashComment = False
@@ -412,17 +412,17 @@ class CodeFragmentCollector:
PPDirectiveObj.EndPos = EndLinePos
FileProfile.PPDirectiveList.append(PPDirectiveObj)
PPDirectiveObj = None
-
+
if InString or InCharLiteral:
CurrentLine = "".join(self.__CurrentLine())
if CurrentLine.rstrip(T_CHAR_LF).rstrip(T_CHAR_CR).endswith(T_CHAR_BACKSLASH):
SlashIndex = CurrentLine.rindex(T_CHAR_BACKSLASH)
self.__SetCharValue(self.CurrentLineNumber, SlashIndex, T_CHAR_SPACE)
-
+
if InComment and not DoubleSlashComment and not HashComment:
CommentObj.Content += T_CHAR_LF
self.CurrentLineNumber += 1
- self.CurrentOffsetWithinLine = 0
+ self.CurrentOffsetWithinLine = 0
# check for */ comment end
elif InComment and not DoubleSlashComment and not HashComment and self.__CurrentChar() == T_CHAR_STAR and self.__NextChar() == T_CHAR_SLASH:
CommentObj.Content += self.__CurrentChar()
@@ -436,7 +436,7 @@ class CodeFragmentCollector:
self.__GetOneChar()
InComment = False
# set comments to spaces
- elif InComment:
+ elif InComment:
if HashComment:
# // follows hash PP directive
if self.__CurrentChar() == T_CHAR_SLASH and self.__NextChar() == T_CHAR_SLASH:
@@ -462,7 +462,7 @@ class CodeFragmentCollector:
# check for '#' comment
elif self.__CurrentChar() == T_CHAR_HASH and not InString and not InCharLiteral:
InComment = True
- HashComment = True
+ HashComment = True
PPDirectiveObj = PP_Directive('', (self.CurrentLineNumber, self.CurrentOffsetWithinLine), None)
# check for /* comment start
elif self.__CurrentChar() == T_CHAR_SLASH and self.__NextChar() == T_CHAR_STAR:
@@ -478,7 +478,7 @@ class CodeFragmentCollector:
self.__GetOneChar()
EndLinePos = (self.CurrentLineNumber, self.CurrentOffsetWithinLine)
-
+
if InComment and DoubleSlashComment:
CommentObj.EndPos = EndLinePos
FileProfile.CommentList.append(CommentObj)
@@ -506,7 +506,7 @@ class CodeFragmentCollector:
tStream = antlr3.CommonTokenStream(lexer)
parser = CParser(tStream)
parser.translation_unit()
-
+
def ParseFileWithClearedPPDirective(self):
self.PreprocessFileWithClear()
# restore from ListOfList to ListOfString
@@ -519,7 +519,7 @@ class CodeFragmentCollector:
tStream = antlr3.CommonTokenStream(lexer)
parser = CParser(tStream)
parser.translation_unit()
-
+
def CleanFileProfileBuffer(self):
FileProfile.CommentList = []
FileProfile.PPDirectiveList = []
@@ -530,61 +530,61 @@ class CodeFragmentCollector:
FileProfile.StructUnionDefinitionList = []
FileProfile.TypedefDefinitionList = []
FileProfile.FunctionCallingList = []
-
+
def PrintFragments(self):
-
+
print '################# ' + self.FileName + '#####################'
-
+
print '/****************************************/'
print '/*************** COMMENTS ***************/'
print '/****************************************/'
for comment in FileProfile.CommentList:
print str(comment.StartPos) + comment.Content
-
+
print '/****************************************/'
print '/********* PREPROCESS DIRECTIVES ********/'
print '/****************************************/'
for pp in FileProfile.PPDirectiveList:
print str(pp.StartPos) + pp.Content
-
+
print '/****************************************/'
print '/********* VARIABLE DECLARATIONS ********/'
print '/****************************************/'
for var in FileProfile.VariableDeclarationList:
print str(var.StartPos) + var.Modifier + ' '+ var.Declarator
-
+
print '/****************************************/'
print '/********* FUNCTION DEFINITIONS *********/'
print '/****************************************/'
for func in FileProfile.FunctionDefinitionList:
print str(func.StartPos) + func.Modifier + ' '+ func.Declarator + ' ' + str(func.NamePos)
-
+
print '/****************************************/'
print '/************ ENUMERATIONS **************/'
print '/****************************************/'
for enum in FileProfile.EnumerationDefinitionList:
print str(enum.StartPos) + enum.Content
-
+
print '/****************************************/'
print '/*********** STRUCTS/UNIONS *************/'
print '/****************************************/'
for su in FileProfile.StructUnionDefinitionList:
print str(su.StartPos) + su.Content
-
+
print '/****************************************/'
print '/********* PREDICATE EXPRESSIONS ********/'
print '/****************************************/'
for predexp in FileProfile.PredicateExpressionList:
print str(predexp.StartPos) + predexp.Content
-
- print '/****************************************/'
+
+ print '/****************************************/'
print '/************** TYPEDEFS ****************/'
print '/****************************************/'
for typedef in FileProfile.TypedefDefinitionList:
print str(typedef.StartPos) + typedef.ToType
-
+
if __name__ == "__main__":
-
+
collector = CodeFragmentCollector(sys.argv[1])
collector.PreprocessFile()
print "For Test."
diff --git a/BaseTools/Source/Python/Ecc/Configuration.py b/BaseTools/Source/Python/Ecc/Configuration.py
index b523858e1b1f..818c4c641c74 100644
--- a/BaseTools/Source/Python/Ecc/Configuration.py
+++ b/BaseTools/Source/Python/Ecc/Configuration.py
@@ -111,7 +111,7 @@ class Configuration(object):
self.HeaderCheckCFileCommentReferenceFormat = 1
# Check whether C File header Comment have the License immediately after the ""Copyright"" line
self.HeaderCheckCFileCommentLicenseFormat = 1
-
+
## C Function Layout Checking
self.CFunctionLayoutCheckAll = 0
@@ -248,7 +248,7 @@ class Configuration(object):
self.MetaDataFileCheckModuleFilePpiFormat = 1
# Check Pcd Format in INF files
self.MetaDataFileCheckModuleFilePcdFormat = 1
-
+
# Check UNI file
self.UniCheckAll = 0
# Check INF or DEC file whether defined the localized information in the associated UNI file.
@@ -270,16 +270,16 @@ class Configuration(object):
# The directory listed here will not be parsed, split with ','
self.SkipDirList = []
-
+
# The file listed here will not be parsed, split with ','
self.SkipFileList = []
# A list for binary file ext name
self.BinaryExtList = []
-
+
# A list for only scanned folders
self.ScanOnlyDirList = []
-
+
# A list for Copyright format
self.Copyright = []
diff --git a/BaseTools/Source/Python/Ecc/Ecc.py b/BaseTools/Source/Python/Ecc/Ecc.py
index 60dfc00260f1..7760ae1359d5 100644
--- a/BaseTools/Source/Python/Ecc/Ecc.py
+++ b/BaseTools/Source/Python/Ecc/Ecc.py
@@ -66,17 +66,17 @@ class Ecc(object):
# Parse the options and args
self.ParseOption()
EdkLogger.info(time.strftime("%H:%M:%S, %b.%d %Y ", time.localtime()) + "[00:00]" + "\n")
-
+
#
# Check EFI_SOURCE (Edk build convention). EDK_SOURCE will always point to ECP
#
WorkspaceDir = os.path.normcase(os.path.normpath(os.environ["WORKSPACE"]))
os.environ["WORKSPACE"] = WorkspaceDir
-
+
# set multiple workspace
PackagesPath = os.getenv("PACKAGES_PATH")
mws.setWs(WorkspaceDir, PackagesPath)
-
+
if "ECP_SOURCE" not in os.environ:
os.environ["ECP_SOURCE"] = mws.join(WorkspaceDir, GlobalData.gEdkCompatibilityPkg)
if "EFI_SOURCE" not in os.environ:
@@ -90,11 +90,11 @@ class Ecc(object):
EfiSourceDir = os.path.normcase(os.path.normpath(os.environ["EFI_SOURCE"]))
EdkSourceDir = os.path.normcase(os.path.normpath(os.environ["EDK_SOURCE"]))
EcpSourceDir = os.path.normcase(os.path.normpath(os.environ["ECP_SOURCE"]))
-
+
os.environ["EFI_SOURCE"] = EfiSourceDir
os.environ["EDK_SOURCE"] = EdkSourceDir
os.environ["ECP_SOURCE"] = EcpSourceDir
-
+
GlobalData.gWorkspace = WorkspaceDir
GlobalData.gEfiSource = EfiSourceDir
GlobalData.gEdkSource = EdkSourceDir
@@ -104,7 +104,7 @@ class Ecc(object):
GlobalData.gGlobalDefines["EFI_SOURCE"] = EfiSourceDir
GlobalData.gGlobalDefines["EDK_SOURCE"] = EdkSourceDir
GlobalData.gGlobalDefines["ECP_SOURCE"] = EcpSourceDir
-
+
EdkLogger.info("Loading ECC configuration ... done")
# Generate checkpoints list
EccGlobalData.gConfig = Configuration(self.ConfigFile)
@@ -120,11 +120,11 @@ class Ecc(object):
# Get files real name in workspace dir
#
GlobalData.gAllFiles = DirCache(GlobalData.gWorkspace)
-
+
# Build ECC database
# self.BuildDatabase()
self.DetectOnlyScanDirs()
-
+
# Start to check
self.Check()
@@ -160,8 +160,8 @@ class Ecc(object):
EdkLogger.error("ECC", BuildToolError.OPTION_VALUE_INVALID, ExtraData="Use -f option need to fill specific folders in config.ini file")
else:
self.BuildDatabase()
-
-
+
+
## BuildDatabase
#
# Build the database for target
@@ -172,7 +172,7 @@ class Ecc(object):
EccGlobalData.gDb.TblReport.Create()
# Build database
- if self.IsInit:
+ if self.IsInit:
if self.ScanMetaData:
EdkLogger.quiet("Building database for Meta Data File ...")
self.BuildMetaDataFileDatabase(SpeciDirs)
@@ -198,7 +198,7 @@ class Ecc(object):
if SpecificDirs is None:
ScanFolders.append(EccGlobalData.gTarget)
else:
- for specificDir in SpecificDirs:
+ for specificDir in SpecificDirs:
ScanFolders.append(os.path.join(EccGlobalData.gTarget, specificDir))
EdkLogger.quiet("Building database for meta data files ...")
Op = open(EccGlobalData.gConfig.MetaDataFileCheckPathOfGenerateFileList, 'w+')
@@ -219,7 +219,7 @@ class Ecc(object):
# symlinks to directories are treated as directories
Dirs.remove(Dir)
Dirs.append(Dirname)
-
+
for File in Files:
if len(File) > 4 and File[-4:].upper() == ".DEC":
Filename = os.path.normpath(os.path.join(Root, File))
diff --git a/BaseTools/Source/Python/Ecc/Exception.py b/BaseTools/Source/Python/Ecc/Exception.py
index b0882afa6289..ef96264ab203 100644
--- a/BaseTools/Source/Python/Ecc/Exception.py
+++ b/BaseTools/Source/Python/Ecc/Exception.py
@@ -23,12 +23,12 @@ class ExceptionXml(object):
self.KeyWord = ''
self.ErrorID = ''
self.FilePath = ''
-
+
def FromXml(self, Item, Key):
self.KeyWord = XmlElement(Item, '%s/KeyWord' % Key)
self.ErrorID = XmlElement(Item, '%s/ErrorID' % Key)
self.FilePath = os.path.normpath(XmlElement(Item, '%s/FilePath' % Key))
-
+
def __str__(self):
return 'ErrorID = %s KeyWord = %s FilePath = %s' %(self.ErrorID, self.KeyWord, self.FilePath)
@@ -36,22 +36,22 @@ class ExceptionXml(object):
class ExceptionListXml(object):
def __init__(self):
self.List = []
-
+
def FromXmlFile(self, FilePath):
XmlContent = XmlParseFile(FilePath)
for Item in XmlList(XmlContent, '/ExceptionList/Exception'):
Exp = ExceptionXml()
Exp.FromXml(Item, 'Exception')
self.List.append(Exp)
-
+
def ToList(self):
RtnList = []
for Item in self.List:
#RtnList.append((Item.ErrorID, Item.KeyWord, Item.FilePath))
RtnList.append((Item.ErrorID, Item.KeyWord))
-
+
return RtnList
-
+
def __str__(self):
RtnStr = ''
if self.List:
@@ -70,7 +70,7 @@ class ExceptionCheck(object):
if FilePath and os.path.isfile(FilePath):
self.ExceptionListXml.FromXmlFile(FilePath)
self.ExceptionList = self.ExceptionListXml.ToList()
-
+
def IsException(self, ErrorID, KeyWord, FileID=-1):
if (str(ErrorID), KeyWord.replace('\r\n', '\n')) in self.ExceptionList:
return True
diff --git a/BaseTools/Source/Python/Ecc/FileProfile.py b/BaseTools/Source/Python/Ecc/FileProfile.py
index f31d37ff9683..4220a75a219e 100644
--- a/BaseTools/Source/Python/Ecc/FileProfile.py
+++ b/BaseTools/Source/Python/Ecc/FileProfile.py
@@ -36,7 +36,7 @@ FunctionCallingList = []
# May raise Exception when opening file.
#
class FileProfile :
-
+
## The constructor
#
# @param self The object pointer
@@ -54,5 +54,4 @@ class FileProfile :
except IOError:
raise Warning("Error when opening file %s" % FileName)
-
-
\ No newline at end of file
+
diff --git a/BaseTools/Source/Python/Ecc/MetaDataParser.py b/BaseTools/Source/Python/Ecc/MetaDataParser.py
index 82ede3eb330c..b6d88c7b15a0 100644
--- a/BaseTools/Source/Python/Ecc/MetaDataParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaDataParser.py
@@ -87,16 +87,16 @@ def GetTableList(FileModelList, Table, Db):
# @param FileName: FileName of the comment
#
def ParseHeaderCommentSection(CommentList, FileName = None):
-
+
Abstract = ''
Description = ''
Copyright = ''
License = ''
EndOfLine = "\n"
STR_HEADER_COMMENT_START = "@file"
-
+
#
- # used to indicate the state of processing header comment section of dec,
+ # used to indicate the state of processing header comment section of dec,
# inf files
#
HEADER_COMMENT_NOT_STARTED = -1
@@ -117,11 +117,11 @@ def ParseHeaderCommentSection(CommentList, FileName = None):
if _IsCopyrightLine(Line):
Last = Index
break
-
+
for Item in CommentList:
Line = Item[0]
LineNo = Item[1]
-
+
if not Line.startswith('#') and Line:
SqlStatement = """ select ID from File where FullPath like '%s'""" % FileName
ResultSet = EccGlobalData.gDb.TblFile.Exec(SqlStatement)
@@ -131,14 +131,14 @@ def ParseHeaderCommentSection(CommentList, FileName = None):
Comment = CleanString2(Line)[1]
Comment = Comment.strip()
#
- # if there are blank lines between License or Description, keep them as they would be
+ # if there are blank lines between License or Description, keep them as they would be
# indication of different block; or in the position that Abstract should be, also keep it
# as it indicates that no abstract
#
if not Comment and HeaderCommentStage not in [HEADER_COMMENT_LICENSE, \
HEADER_COMMENT_DESCRIPTION, HEADER_COMMENT_ABSTRACT]:
continue
-
+
if HeaderCommentStage == HEADER_COMMENT_NOT_STARTED:
if Comment.startswith(STR_HEADER_COMMENT_START):
HeaderCommentStage = HEADER_COMMENT_ABSTRACT
@@ -152,39 +152,39 @@ def ParseHeaderCommentSection(CommentList, FileName = None):
if not Comment:
Abstract = ''
HeaderCommentStage = HEADER_COMMENT_DESCRIPTION
- elif _IsCopyrightLine(Comment):
+ elif _IsCopyrightLine(Comment):
Copyright += Comment + EndOfLine
HeaderCommentStage = HEADER_COMMENT_COPYRIGHT
- else:
+ else:
Abstract += Comment + EndOfLine
HeaderCommentStage = HEADER_COMMENT_DESCRIPTION
elif HeaderCommentStage == HEADER_COMMENT_DESCRIPTION:
#
# in case there is no description
- #
- if _IsCopyrightLine(Comment):
+ #
+ if _IsCopyrightLine(Comment):
Copyright += Comment + EndOfLine
HeaderCommentStage = HEADER_COMMENT_COPYRIGHT
else:
- Description += Comment + EndOfLine
+ Description += Comment + EndOfLine
elif HeaderCommentStage == HEADER_COMMENT_COPYRIGHT:
- if _IsCopyrightLine(Comment):
+ if _IsCopyrightLine(Comment):
Copyright += Comment + EndOfLine
else:
#
# Contents after copyright line are license, those non-copyright lines in between
- # copyright line will be discarded
+ # copyright line will be discarded
#
if LineNo > Last:
if License:
License += EndOfLine
License += Comment + EndOfLine
- HeaderCommentStage = HEADER_COMMENT_LICENSE
+ HeaderCommentStage = HEADER_COMMENT_LICENSE
else:
if not Comment and not License:
continue
License += Comment + EndOfLine
-
+
if not Copyright.strip():
SqlStatement = """ select ID from File where FullPath like '%s'""" % FileName
ResultSet = EccGlobalData.gDb.TblFile.Exec(SqlStatement)
@@ -198,19 +198,19 @@ def ParseHeaderCommentSection(CommentList, FileName = None):
for Result in ResultSet:
Msg = 'Header comment section must have license information'
EccGlobalData.gDb.TblReport.Insert(ERROR_DOXYGEN_CHECK_FILE_HEADER, Msg, "File", Result[0])
-
+
if not Abstract.strip() or Abstract.find('Component description file') > -1:
SqlStatement = """ select ID from File where FullPath like '%s'""" % FileName
ResultSet = EccGlobalData.gDb.TblFile.Exec(SqlStatement)
for Result in ResultSet:
Msg = 'Header comment section must have Abstract information.'
EccGlobalData.gDb.TblReport.Insert(ERROR_DOXYGEN_CHECK_FILE_HEADER, Msg, "File", Result[0])
-
+
return Abstract.strip(), Description.strip(), Copyright.strip(), License.strip()
## _IsCopyrightLine
-# check whether current line is copyright line, the criteria is whether there is case insensitive keyword "Copyright"
-# followed by zero or more white space characters followed by a "(" character
+# check whether current line is copyright line, the criteria is whether there is case insensitive keyword "Copyright"
+# followed by zero or more white space characters followed by a "(" character
#
# @param LineContent: the line need to be checked
# @return: True if current line is copyright line, False else
@@ -218,11 +218,11 @@ def ParseHeaderCommentSection(CommentList, FileName = None):
def _IsCopyrightLine (LineContent):
LineContent = LineContent.upper()
Result = False
-
+
ReIsCopyrightRe = re.compile(r"""(^|\s)COPYRIGHT *\(""", re.DOTALL)
if ReIsCopyrightRe.search(LineContent):
Result = True
-
+
return Result
@@ -232,7 +232,7 @@ def _IsCopyrightLine (LineContent):
# Remove spaces
#
# @param Line: The string to be cleaned
-# @param CommentCharacter: Comment char, used to ignore comment content,
+# @param CommentCharacter: Comment char, used to ignore comment content,
# default is DataType.TAB_COMMENT_SPLIT
#
def CleanString2(Line, CommentCharacter='#', AllowCppStyleComment=False):
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
index 4d61cd1cea91..659997045bc0 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
@@ -92,7 +92,7 @@ def ParseMacro(Parser):
elif (Name in self._FileLocalMacros) and (self._FileLocalMacros[Name] != Value):
EdkLogger.error('Parser', FORMAT_INVALID, "EDK_GLOBAL defined a macro with the same name and different value as one defined by 'DEFINE'",
ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex+1)
-
+
self._ValueList = [Type, Name, Value]
return MacroParser
@@ -334,7 +334,7 @@ class MetaFileParser(object):
self._ValueList = [ReplaceMacro(Value, self._Macros) for Value in self._ValueList]
Name, Value = self._ValueList[1], self._ValueList[2]
- # Sometimes, we need to make differences between EDK and EDK2 modules
+ # Sometimes, we need to make differences between EDK and EDK2 modules
if Name == 'INF_VERSION':
try:
self._Version = int(Value, 0)
@@ -354,7 +354,7 @@ class MetaFileParser(object):
UniFile = os.path.join(os.path.dirname(self.MetaFile), Value)
if os.path.exists(UniFile):
self._UniObj = UniParser(UniFile, IsExtraUni=False, IsModuleUni=False)
-
+
if type(self) == InfParser and self._Version < 0x00010005:
# EDK module allows using defines as macros
self._FileLocalMacros[Name] = Value
@@ -390,7 +390,7 @@ class MetaFileParser(object):
return Macros
- ## Get section Macros that are applicable to current line, which may come from other sections
+ ## Get section Macros that are applicable to current line, which may come from other sections
## that share the same name while scope is wider
def _GetApplicableSectionMacro(self):
Macros = {}
@@ -473,7 +473,7 @@ class InfParser(MetaFileParser):
self.FileID = FileID
else:
self.FileID = self.TblFile.InsertFile(Filename, MODEL_FILE_INF)
-
+
# parse the file line by line
IsFindBlockComment = False
@@ -591,7 +591,7 @@ class InfParser(MetaFileParser):
)
Usage = ''
if IsFindBlockComment:
- EdkLogger.error("Parser", FORMAT_INVALID, "Open block comments (starting with /*) are expected to end with */",
+ EdkLogger.error("Parser", FORMAT_INVALID, "Open block comments (starting with /*) are expected to end with */",
File=self.MetaFile)
self._Done()
@@ -818,7 +818,7 @@ class DscParser(MetaFileParser):
# the owner item
#
self._IdMapping = {-1:-1}
-
+
self.TblFile = EccGlobalData.gDb.TblFile
self.FileID = -1
@@ -838,8 +838,8 @@ class DscParser(MetaFileParser):
self.FileID = FileID
else:
self.FileID = self.TblFile.InsertFile(Filename, MODEL_FILE_DSC)
-
-
+
+
for Index in range(0, len(Content)):
Line = CleanString(Content[Index])
# skip empty line
@@ -850,7 +850,7 @@ class DscParser(MetaFileParser):
self._LineIndex = Index
if self._InSubsection and self._Owner[-1] == -1:
self._Owner.append(self._LastItem)
-
+
# section header
if Line[0] == TAB_SECTION_START and Line[-1] == TAB_SECTION_END:
self._SectionType = MODEL_META_DATA_SECTION_HEADER
@@ -960,7 +960,7 @@ class DscParser(MetaFileParser):
elif self._From > 0:
EdkLogger.error('Parser', FORMAT_INVALID,
"No '!include' allowed in included file",
- ExtraData=self._CurrentLine, File=self.MetaFile,
+ ExtraData=self._CurrentLine, File=self.MetaFile,
Line=self._LineIndex+1)
#
@@ -1154,7 +1154,7 @@ class DscParser(MetaFileParser):
MODEL_META_DATA_USER_EXTENSION : self._Skip,
MODEL_META_DATA_CONDITIONAL_STATEMENT_ERROR : self._Skip,
}
-
+
self._RawTable = self._Table
self._Table = MetaFileStorage(self._RawTable.Cur, self.MetaFile, MODEL_FILE_DSC, True)
self._DirectiveStack = []
@@ -1184,7 +1184,7 @@ class DscParser(MetaFileParser):
try:
Processer[self._ItemType]()
except EvaluationException, Excpt:
- #
+ #
# Only catch expression evaluation error here. We need to report
# the precise number of line on which the error occurred
#
@@ -1194,11 +1194,11 @@ class DscParser(MetaFileParser):
# Line=self._LineIndex+1)
except MacroException, Excpt:
EdkLogger.error('Parser', FORMAT_INVALID, str(Excpt),
- File=self._FileWithError, ExtraData=' '.join(self._ValueList),
+ File=self._FileWithError, ExtraData=' '.join(self._ValueList),
Line=self._LineIndex+1)
if self._ValueList is None:
- continue
+ continue
NewOwner = self._IdMapping.get(Owner, -1)
self._Enabled = int((not self._DirectiveEvalStack) or (False not in self._DirectiveEvalStack))
@@ -1221,7 +1221,7 @@ class DscParser(MetaFileParser):
self._IdMapping[Id] = self._LastItem
RecordList = self._Table.GetAll()
-
+
self._RawTable.Drop()
self._Table.Drop()
for Record in RecordList:
@@ -1255,7 +1255,7 @@ class DscParser(MetaFileParser):
# Don't use PCD with different values.
if Name in self._Symbols and self._Symbols[Name] != Value:
self._Symbols.pop(Name)
- continue
+ continue
self._Symbols[Name] = Value
Records = self._RawTable.Query(MODEL_PCD_FIXED_AT_BUILD, BelongsToItem=-1.0)
@@ -1263,12 +1263,12 @@ class DscParser(MetaFileParser):
Value, DatumType, MaxDatumSize = AnalyzePcdData(Value)
# Only use PCD whose value is straitforward (no macro and PCD)
if self.SymbolPattern.findall(Value):
- continue
+ continue
Name = TokenSpaceGuid+'.'+PcdName
# Don't use PCD with different values.
if Name in self._Symbols and self._Symbols[Name] != Value:
self._Symbols.pop(Name)
- continue
+ continue
self._Symbols[Name] = Value
def __ProcessDefine(self):
@@ -1288,13 +1288,13 @@ class DscParser(MetaFileParser):
SectionLocalMacros[Name] = Value
elif self._ItemType == MODEL_META_DATA_GLOBAL_DEFINE:
GlobalData.gEdkGlobal[Name] = Value
-
+
#
# Keyword in [Defines] section can be used as Macros
#
if (self._ItemType == MODEL_META_DATA_HEADER) and (self._SectionType == MODEL_META_DATA_HEADER):
self._FileLocalMacros[Name] = Value
-
+
self._ValueList = [Type, Name, Value]
def __ProcessDirective(self):
@@ -1309,12 +1309,12 @@ class DscParser(MetaFileParser):
EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc), self._ValueList[1])
Result = False
except WrnExpression, Excpt:
- #
+ #
# Catch expression evaluation warning here. We need to report
# the precise number of line and return the evaluation result
#
EdkLogger.warn('Parser', "Suspicious expression: %s" % str(Excpt),
- File=self._FileWithError, ExtraData=' '.join(self._ValueList),
+ File=self._FileWithError, ExtraData=' '.join(self._ValueList),
Line=self._LineIndex+1)
Result = Excpt.result
except BadExpression, Exc:
@@ -1365,14 +1365,14 @@ class DscParser(MetaFileParser):
#
elif "ECP_SOURCE" in GlobalData.gCommandLineDefines.keys():
__IncludeMacros['ECP_SOURCE'] = GlobalData.gCommandLineDefines['ECP_SOURCE']
-
+
__IncludeMacros['EFI_SOURCE'] = GlobalData.gGlobalDefines['EFI_SOURCE']
__IncludeMacros['EDK_SOURCE'] = GlobalData.gGlobalDefines['EDK_SOURCE']
#
- # Allow using MACROs comes from [Defines] section to keep compatible.
+ # Allow using MACROs comes from [Defines] section to keep compatible.
#
__IncludeMacros.update(self._Macros)
-
+
IncludedFile = NormPath(ReplaceMacro(self._ValueList[1], __IncludeMacros, RaiseError=True))
#
# First search the include file under the same directory as DSC file
@@ -1386,14 +1386,14 @@ class DscParser(MetaFileParser):
IncludedFile1 = PathClass(IncludedFile, GlobalData.gWorkspace)
ErrorCode, ErrorInfo2 = IncludedFile1.Validate()
if ErrorCode != 0:
- EdkLogger.error('parser', ErrorCode, File=self._FileWithError,
+ EdkLogger.error('parser', ErrorCode, File=self._FileWithError,
Line=self._LineIndex+1, ExtraData=ErrorInfo1 + "\n"+ ErrorInfo2)
self._FileWithError = IncludedFile1
IncludedFileTable = MetaFileStorage(self._Table.Cur, IncludedFile1, MODEL_FILE_DSC, True)
Owner = self._Content[self._ContentIndex-1][0]
- Parser = DscParser(IncludedFile1, self._FileType, IncludedFileTable,
+ Parser = DscParser(IncludedFile1, self._FileType, IncludedFileTable,
Owner=Owner, From=Owner)
# set the parser status with current status
@@ -1417,7 +1417,7 @@ class DscParser(MetaFileParser):
self._Content.pop(self._ContentIndex-1)
self._ValueList = None
self._ContentIndex -= 1
-
+
def __ProcessSkuId(self):
self._ValueList = [ReplaceMacro(Value, self._Macros, RaiseError=True)
for Value in self._ValueList]
@@ -1434,22 +1434,22 @@ class DscParser(MetaFileParser):
# PCD value can be an expression
#
if len(ValueList) > 1 and ValueList[1] == TAB_VOID:
- PcdValue = ValueList[0]
+ PcdValue = ValueList[0]
try:
ValueList[0] = ValueExpression(PcdValue, self._Macros)(True)
except WrnExpression, Value:
- ValueList[0] = Value.result
+ ValueList[0] = Value.result
else:
PcdValue = ValueList[-1]
try:
ValueList[-1] = ValueExpression(PcdValue, self._Macros)(True)
except WrnExpression, Value:
ValueList[-1] = Value.result
-
+
if ValueList[-1] == 'True':
ValueList[-1] = '1'
if ValueList[-1] == 'False':
- ValueList[-1] = '0'
+ ValueList[-1] = '0'
self._ValueList[2] = '|'.join(ValueList)
@@ -1548,7 +1548,7 @@ class DecParser(MetaFileParser):
self.FileID = FileID
else:
self.FileID = self.TblFile.InsertFile(Filename, MODEL_FILE_DEC)
-
+
for Index in range(0, len(Content)):
Line, Comment = CleanString2(Content[Index])
self._CurrentLine = Line
@@ -1750,19 +1750,19 @@ class DecParser(MetaFileParser):
" (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
File=self.MetaFile, Line=self._LineIndex+1)
-
+
ValueRe = re.compile(r'^\s*L?\".*\|.*\"')
PtrValue = ValueRe.findall(TokenList[1])
-
- # Has VOID* type string, may contain "|" character in the string.
+
+ # Has VOID* type string, may contain "|" character in the string.
if len(PtrValue) != 0:
ptrValueList = re.sub(ValueRe, '', TokenList[1])
ValueList = GetSplitValueList(ptrValueList)
ValueList[0] = PtrValue[0]
else:
ValueList = GetSplitValueList(TokenList[1])
-
-
+
+
# check if there's enough datum information given
if len(ValueList) != 3:
EdkLogger.error('Parser', FORMAT_INVALID, "Invalid PCD Datum information given",
@@ -1792,7 +1792,7 @@ class DecParser(MetaFileParser):
if not IsValid:
EdkLogger.error('Parser', FORMAT_INVALID, Cause, ExtraData=self._CurrentLine,
File=self.MetaFile, Line=self._LineIndex+1)
-
+
if EccGlobalData.gConfig.UniCheckPCDInfo == '1' or EccGlobalData.gConfig.UniCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
# check Description, Prompt information
PatternDesc = re.compile('##\s*([\x21-\x7E\s]*)', re.S)
@@ -1903,7 +1903,7 @@ class DecParser(MetaFileParser):
## Fdf
#
# This class defined the structure used in Fdf object
-#
+#
# @param Filename: Input value for Ffilename of Fdf file, default is None
# @param WorkspaceDir: Input value for current workspace directory, default is None
#
@@ -1911,7 +1911,7 @@ class Fdf(object):
def __init__(self, Filename = None, IsToDatabase = False, WorkspaceDir = None, Database = None):
self.WorkspaceDir = WorkspaceDir
self.IsToDatabase = IsToDatabase
-
+
self.Cur = Database.Cur
self.TblFile = Database.TblFile
self.TblFdf = Database.TblFdf
@@ -1938,15 +1938,15 @@ class Fdf(object):
self.FileList[Filename] = FileID
return self.FileList[Filename]
-
-
+
+
## Load Fdf file
#
# Load the file if it exists
#
# @param Filename: Input value for filename of Fdf file
#
- def LoadFdfFile(self, Filename):
+ def LoadFdfFile(self, Filename):
FileList = []
#
# Parse Fdf file
@@ -1991,7 +1991,7 @@ class UniParser(object):
self.FileIn = None
self.Missing = []
self.__read()
-
+
def __read(self):
try:
self.FileIn = CodecOpenLongFilePath(self.FilePath, Mode='rb', Encoding='utf_8').read()
@@ -2001,7 +2001,7 @@ class UniParser(object):
self.FileIn = CodecOpenLongFilePath(self.FilePath, Mode='rb', Encoding='utf_16_le').read()
except IOError:
self.FileIn = ""
-
+
def Start(self):
if self.IsModuleUni:
if self.IsExtraUni:
@@ -2021,7 +2021,7 @@ class UniParser(object):
self.PrintLog('STR_PACKAGE_ABSTRACT', PackageAbstract)
PackageDescription = self.CheckKeyValid('STR_PACKAGE_DESCRIPTION')
self.PrintLog('STR_PACKAGE_DESCRIPTION', PackageDescription)
-
+
def CheckKeyValid(self, Key, Contents=None):
if not Contents:
Contents = self.FileIn
@@ -2029,7 +2029,7 @@ class UniParser(object):
if KeyPattern.search(Contents):
return True
return False
-
+
def CheckPcdInfo(self, PcdCName):
PromptKey = 'STR_%s_PROMPT' % PcdCName.replace('.', '_')
PcdPrompt = self.CheckKeyValid(PromptKey)
@@ -2037,7 +2037,7 @@ class UniParser(object):
HelpKey = 'STR_%s_HELP' % PcdCName.replace('.', '_')
PcdHelp = self.CheckKeyValid(HelpKey)
self.PrintLog(HelpKey, PcdHelp)
-
+
def PrintLog(self, Key, Value):
if not Value and Key not in self.Missing:
Msg = '%s is missing in the %s file.' % (Key, self.FileName)
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py
index 9faa6b58b001..5376437e3d13 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py
@@ -25,7 +25,7 @@ from CommonDataClass.DataClass import MODEL_FILE_DSC, MODEL_FILE_DEC, MODEL_FILE
MODEL_FILE_OTHERS
class MetaFileTable(Table):
- ## Constructor
+ ## Constructor
def __init__(self, Cursor, MetaFile, FileType, TableName, Temporary = False):
self.MetaFile = MetaFile
self.TblFile = EccGlobalData.gDb.TblFile
@@ -88,30 +88,30 @@ class ModuleTable(MetaFileTable):
BelongsToItem=-1, BelongsToFile = -1, StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1, Enabled=0, Usage=''):
(Value1, Value2, Value3, Usage, Scope1, Scope2) = ConvertToSqlString((Value1, Value2, Value3, Usage, Scope1, Scope2))
return Table.Insert(
- self,
- Model,
- Value1,
- Value2,
- Value3,
- Usage,
- Scope1,
+ self,
+ Model,
+ Value1,
+ Value2,
+ Value3,
+ Usage,
+ Scope1,
Scope2,
BelongsToItem,
- BelongsToFile,
- StartLine,
- StartColumn,
- EndLine,
- EndColumn,
+ BelongsToFile,
+ StartLine,
+ StartColumn,
+ EndLine,
+ EndColumn,
Enabled
)
## Query table
#
- # @param Model: The Model of Record
- # @param Arch: The Arch attribute of Record
- # @param Platform The Platform attribute of Record
+ # @param Model: The Model of Record
+ # @param Arch: The Arch attribute of Record
+ # @param Platform The Platform attribute of Record
#
- # @retval: A recordSet of all found records
+ # @retval: A recordSet of all found records
#
def Query(self, Model, Arch=None, Platform=None):
ConditionString = "Model=%s AND Enabled>=0" % Model
@@ -171,28 +171,28 @@ class PackageTable(MetaFileTable):
BelongsToItem=-1, BelongsToFile = -1, StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1, Enabled=0):
(Value1, Value2, Value3, Scope1, Scope2) = ConvertToSqlString((Value1, Value2, Value3, Scope1, Scope2))
return Table.Insert(
- self,
- Model,
- Value1,
- Value2,
- Value3,
- Scope1,
+ self,
+ Model,
+ Value1,
+ Value2,
+ Value3,
+ Scope1,
Scope2,
BelongsToItem,
- BelongsToFile,
- StartLine,
- StartColumn,
- EndLine,
- EndColumn,
+ BelongsToFile,
+ StartLine,
+ StartColumn,
+ EndLine,
+ EndColumn,
Enabled
)
## Query table
#
- # @param Model: The Model of Record
- # @param Arch: The Arch attribute of Record
+ # @param Model: The Model of Record
+ # @param Arch: The Arch attribute of Record
#
- # @retval: A recordSet of all found records
+ # @retval: A recordSet of all found records
#
def Query(self, Model, Arch=None):
ConditionString = "Model=%s AND Enabled>=0" % Model
@@ -252,32 +252,32 @@ class PlatformTable(MetaFileTable):
FromItem=-1, StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1, Enabled=1):
(Value1, Value2, Value3, Scope1, Scope2) = ConvertToSqlString((Value1, Value2, Value3, Scope1, Scope2))
return Table.Insert(
- self,
- Model,
- Value1,
- Value2,
- Value3,
- Scope1,
+ self,
+ Model,
+ Value1,
+ Value2,
+ Value3,
+ Scope1,
Scope2,
- BelongsToItem,
+ BelongsToItem,
BelongsToFile,
FromItem,
- StartLine,
- StartColumn,
- EndLine,
- EndColumn,
+ StartLine,
+ StartColumn,
+ EndLine,
+ EndColumn,
Enabled
)
## Query table
#
- # @param Model: The Model of Record
+ # @param Model: The Model of Record
# @param Scope1: Arch of a Dsc item
# @param Scope2: Module type of a Dsc item
# @param BelongsToItem: The item belongs to which another item
# @param FromItem: The item belongs to which dsc file
#
- # @retval: A recordSet of all found records
+ # @retval: A recordSet of all found records
#
def Query(self, Model, Scope1=None, Scope2=None, BelongsToItem=None, FromItem=None):
ConditionString = "Model=%s AND Enabled>0" % Model
diff --git a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
index a86f19624c44..51772e768a8c 100644
--- a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
+++ b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
@@ -32,7 +32,7 @@ def CreateXmlElement(Name, String, NodeList, AttributeList):
Element = Doc.createElement(Name)
if String != '' and String is not None:
Element.appendChild(Doc.createTextNode(String))
-
+
for Item in NodeList:
if type(Item) == type([]):
Key = Item[0]
@@ -48,7 +48,7 @@ def CreateXmlElement(Name, String, NodeList, AttributeList):
Value = Item[1]
if Key != '' and Key is not None and Value != '' and Value is not None:
Element.setAttribute(Key, Value)
-
+
return Element
## Get a list of XML nodes using XPath style syntax.
diff --git a/BaseTools/Source/Python/Ecc/Xml/__init__.py b/BaseTools/Source/Python/Ecc/Xml/__init__.py
index f09eece5fb0e..4035345f225d 100644
--- a/BaseTools/Source/Python/Ecc/Xml/__init__.py
+++ b/BaseTools/Source/Python/Ecc/Xml/__init__.py
@@ -6,9 +6,9 @@
#
# Copyright (c) 2011, Intel Corporation. All rights reserved.<BR>
#
-# This program and the accompanying materials are licensed and made available
-# under the terms and conditions of the BSD License which accompanies this
-# distribution. The full text of the license may be found at
+# This program and the accompanying materials are licensed and made available
+# under the terms and conditions of the BSD License which accompanies this
+# distribution. The full text of the license may be found at
# http://opensource.org/licenses/bsd-license.php
#
# THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
diff --git a/BaseTools/Source/Python/Ecc/c.py b/BaseTools/Source/Python/Ecc/c.py
index 175e2d2e0439..d10d12a38724 100644
--- a/BaseTools/Source/Python/Ecc/c.py
+++ b/BaseTools/Source/Python/Ecc/c.py
@@ -2348,13 +2348,13 @@ def CheckFileHeaderDoxygenComments(FullFileName):
if (len(CommentStrListTemp) <= 1):
# For Mac
CommentStrListTemp = CommentStr.split('\r')
- # Skip the content before the file header
+ # Skip the content before the file header
for CommentLine in CommentStrListTemp:
if CommentLine.strip().startswith('/** @file'):
FileStartFlag = True
if FileStartFlag == True:
CommentStrList.append(CommentLine)
-
+
ID = Result[1]
Index = 0
if CommentStrList and CommentStrList[0].strip().startswith('/** @file'):
@@ -2377,7 +2377,7 @@ def CheckFileHeaderDoxygenComments(FullFileName):
if EccGlobalData.gConfig.HeaderCheckCFileCommentStartSpacesNum == '1' or EccGlobalData.gConfig.HeaderCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
if CommentLine.startswith('/** @file') == False and CommentLine.startswith('**/') == False and CommentLine.strip() and CommentLine.startswith(' ') == False:
PrintErrorMsg(ERROR_HEADER_CHECK_FILE, 'File header comment content should start with two spaces at each line', FileTable, ID)
-
+
CommentLine = CommentLine.strip()
if CommentLine.startswith('Copyright'):
NoCopyrightFlag = False
@@ -2402,9 +2402,9 @@ def CheckFileHeaderDoxygenComments(FullFileName):
# Check whether C File header Comment's each reference at list should begin with a bullet character.
if EccGlobalData.gConfig.HeaderCheckCFileCommentReferenceFormat == '1' or EccGlobalData.gConfig.HeaderCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
if RefListFlag == True:
- if RefLine.strip() and RefLine.strip().startswith('**/') == False and RefLine.startswith(' -') == False:
- PrintErrorMsg(ERROR_HEADER_CHECK_FILE, 'Each reference on a separate line should begin with a bullet character ""-"" ', FileTable, ID)
-
+ if RefLine.strip() and RefLine.strip().startswith('**/') == False and RefLine.startswith(' -') == False:
+ PrintErrorMsg(ERROR_HEADER_CHECK_FILE, 'Each reference on a separate line should begin with a bullet character ""-"" ', FileTable, ID)
+
if NoHeaderCommentStartFlag:
PrintErrorMsg(ERROR_DOXYGEN_CHECK_FILE_HEADER, 'File header comment should begin with ""/** @file""', FileTable, ID)
return
diff --git a/BaseTools/Source/Python/Eot/CLexer.py b/BaseTools/Source/Python/Eot/CLexer.py
index a496f4344030..c7956e8ddae6 100644
--- a/BaseTools/Source/Python/Eot/CLexer.py
+++ b/BaseTools/Source/Python/Eot/CLexer.py
@@ -2,7 +2,7 @@
from antlr3 import *
from antlr3.compat import set, frozenset
-
+
## @file
# The file defines the Lexer for C source files.
#
@@ -4341,7 +4341,7 @@ class CLexer(Lexer):
u"\12\uffff"
)
-
+
DFA25_transition = [
DFA.unpack(u"\1\2\1\uffff\12\1"),
DFA.unpack(u"\1\3\1\uffff\12\1\12\uffff\1\5\1\4\1\5\35\uffff\1\5"
@@ -4479,7 +4479,7 @@ class CLexer(Lexer):
u"\u0192\uffff"
)
-
+
DFA35_transition = [
DFA.unpack(u"\6\73\2\70\1\73\2\70\22\73\1\70\1\50\1\65\1\72\1\63"
u"\1\45\1\46\1\64\1\34\1\35\1\40\1\42\1\3\1\43\1\41\1\44\1\66\11"
@@ -4943,5 +4943,5 @@ class CLexer(Lexer):
# class definition for DFA #35
DFA35 = DFA
-
+
diff --git a/BaseTools/Source/Python/Eot/CParser.py b/BaseTools/Source/Python/Eot/CParser.py
index 94711a9a378a..e817af86f702 100644
--- a/BaseTools/Source/Python/Eot/CParser.py
+++ b/BaseTools/Source/Python/Eot/CParser.py
@@ -2,7 +2,7 @@
from antlr3 import *
from antlr3.compat import set, frozenset
-
+
## @file
# The file defines the parser for C source files.
#
@@ -56,23 +56,23 @@ OctalEscape=17
# token names
tokenNames = [
- "<invalid>", "<EOR>", "<DOWN>", "<UP>",
- "IDENTIFIER", "HEX_LITERAL", "OCTAL_LITERAL", "DECIMAL_LITERAL", "CHARACTER_LITERAL",
- "STRING_LITERAL", "FLOATING_POINT_LITERAL", "LETTER", "EscapeSequence",
- "HexDigit", "IntegerTypeSuffix", "Exponent", "FloatTypeSuffix", "OctalEscape",
- "UnicodeEscape", "WS", "BS", "UnicodeVocabulary", "COMMENT", "LINE_COMMENT",
- "LINE_COMMAND", "';'", "'typedef'", "','", "'='", "'extern'", "'static'",
- "'auto'", "'register'", "'STATIC'", "'void'", "'char'", "'short'", "'int'",
- "'long'", "'float'", "'double'", "'signed'", "'unsigned'", "'{'", "'}'",
- "'struct'", "'union'", "':'", "'enum'", "'const'", "'volatile'", "'IN'",
- "'OUT'", "'OPTIONAL'", "'CONST'", "'UNALIGNED'", "'VOLATILE'", "'GLOBAL_REMOVE_IF_UNREFERENCED'",
- "'EFIAPI'", "'EFI_BOOTSERVICE'", "'EFI_RUNTIMESERVICE'", "'PACKED'",
- "'('", "')'", "'['", "']'", "'*'", "'...'", "'+'", "'-'", "'/'", "'%'",
- "'++'", "'--'", "'sizeof'", "'.'", "'->'", "'&'", "'~'", "'!'", "'*='",
- "'/='", "'%='", "'+='", "'-='", "'<<='", "'>>='", "'&='", "'^='", "'|='",
- "'?'", "'||'", "'&&'", "'|'", "'^'", "'=='", "'!='", "'<'", "'>'", "'<='",
- "'>='", "'<<'", "'>>'", "'__asm__'", "'_asm'", "'__asm'", "'case'",
- "'default'", "'if'", "'else'", "'switch'", "'while'", "'do'", "'for'",
+ "<invalid>", "<EOR>", "<DOWN>", "<UP>",
+ "IDENTIFIER", "HEX_LITERAL", "OCTAL_LITERAL", "DECIMAL_LITERAL", "CHARACTER_LITERAL",
+ "STRING_LITERAL", "FLOATING_POINT_LITERAL", "LETTER", "EscapeSequence",
+ "HexDigit", "IntegerTypeSuffix", "Exponent", "FloatTypeSuffix", "OctalEscape",
+ "UnicodeEscape", "WS", "BS", "UnicodeVocabulary", "COMMENT", "LINE_COMMENT",
+ "LINE_COMMAND", "';'", "'typedef'", "','", "'='", "'extern'", "'static'",
+ "'auto'", "'register'", "'STATIC'", "'void'", "'char'", "'short'", "'int'",
+ "'long'", "'float'", "'double'", "'signed'", "'unsigned'", "'{'", "'}'",
+ "'struct'", "'union'", "':'", "'enum'", "'const'", "'volatile'", "'IN'",
+ "'OUT'", "'OPTIONAL'", "'CONST'", "'UNALIGNED'", "'VOLATILE'", "'GLOBAL_REMOVE_IF_UNREFERENCED'",
+ "'EFIAPI'", "'EFI_BOOTSERVICE'", "'EFI_RUNTIMESERVICE'", "'PACKED'",
+ "'('", "')'", "'['", "']'", "'*'", "'...'", "'+'", "'-'", "'/'", "'%'",
+ "'++'", "'--'", "'sizeof'", "'.'", "'->'", "'&'", "'~'", "'!'", "'*='",
+ "'/='", "'%='", "'+='", "'-='", "'<<='", "'>>='", "'&='", "'^='", "'|='",
+ "'?'", "'||'", "'&&'", "'|'", "'^'", "'=='", "'!='", "'<'", "'>'", "'<='",
+ "'>='", "'<<'", "'>>'", "'__asm__'", "'_asm'", "'__asm'", "'case'",
+ "'default'", "'if'", "'else'", "'switch'", "'while'", "'do'", "'for'",
"'goto'", "'continue'", "'break'", "'return'"
]
@@ -103,7 +103,7 @@ class CParser(Parser):
def printTokenInfo(self, line, offset, tokenText):
print str(line)+ ',' + str(offset) + ':' + str(tokenText)
-
+
def StorePredicateExpression(self, StartLine, StartOffset, EndLine, EndOffset, Text):
PredExp = CodeFragment.PredicateExpression(Text, (StartLine, StartOffset), (EndLine, EndOffset))
FileProfile.PredicateExpressionList.append(PredExp)
@@ -119,7 +119,7 @@ class CParser(Parser):
def StoreTypedefDefinition(self, StartLine, StartOffset, EndLine, EndOffset, FromText, ToText):
Tdef = CodeFragment.TypedefDefinition(FromText, ToText, (StartLine, StartOffset), (EndLine, EndOffset))
FileProfile.TypedefDefinitionList.append(Tdef)
-
+
def StoreFunctionDefinition(self, StartLine, StartOffset, EndLine, EndOffset, ModifierText, DeclText, LeftBraceLine, LeftBraceOffset, DeclLine, DeclOffset):
FuncDef = CodeFragment.FunctionDefinition(ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset), (LeftBraceLine, LeftBraceOffset), (DeclLine, DeclOffset))
FileProfile.FunctionDefinitionList.append(FuncDef)
@@ -127,11 +127,11 @@ class CParser(Parser):
def StoreVariableDeclaration(self, StartLine, StartOffset, EndLine, EndOffset, ModifierText, DeclText):
VarDecl = CodeFragment.VariableDeclaration(ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset))
FileProfile.VariableDeclarationList.append(VarDecl)
-
+
def StoreFunctionCalling(self, StartLine, StartOffset, EndLine, EndOffset, FuncName, ParamList):
FuncCall = CodeFragment.FunctionCalling(FuncName, ParamList, (StartLine, StartOffset), (EndLine, EndOffset))
FileProfile.FunctionCallingList.append(FuncCall)
-
+
@@ -143,7 +143,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 1):
- return
+ return
# C.g:103:2: ( ( external_declaration )* )
# C.g:103:4: ( external_declaration )*
@@ -162,7 +162,7 @@ class CParser(Parser):
self.external_declaration()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -182,7 +182,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end translation_unit
@@ -195,7 +195,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 2):
- return
+ return
# C.g:119:2: ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? )
alt3 = 3
@@ -211,7 +211,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 1, self.input)
@@ -227,7 +227,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 2, self.input)
@@ -243,7 +243,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 3, self.input)
@@ -259,7 +259,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 4, self.input)
@@ -275,7 +275,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 5, self.input)
@@ -291,7 +291,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 6, self.input)
@@ -307,7 +307,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 7, self.input)
@@ -323,7 +323,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 8, self.input)
@@ -339,7 +339,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 9, self.input)
@@ -355,7 +355,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 10, self.input)
@@ -371,7 +371,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 11, self.input)
@@ -387,7 +387,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 12, self.input)
@@ -405,7 +405,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 13, self.input)
@@ -421,7 +421,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 14, self.input)
@@ -439,7 +439,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 16, self.input)
@@ -455,7 +455,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 17, self.input)
@@ -471,7 +471,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 18, self.input)
@@ -484,7 +484,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 0, self.input)
@@ -496,7 +496,7 @@ class CParser(Parser):
self.function_definition()
self.following.pop()
if self.failed:
- return
+ return
elif alt3 == 2:
@@ -505,7 +505,7 @@ class CParser(Parser):
self.declaration()
self.following.pop()
if self.failed:
- return
+ return
elif alt3 == 3:
@@ -514,7 +514,7 @@ class CParser(Parser):
self.macro_statement()
self.following.pop()
if self.failed:
- return
+ return
# C.g:121:20: ( ';' )?
alt2 = 2
LA2_0 = self.input.LA(1)
@@ -525,7 +525,7 @@ class CParser(Parser):
# C.g:121:21: ';'
self.match(self.input, 25, self.FOLLOW_25_in_external_declaration126)
if self.failed:
- return
+ return
@@ -541,7 +541,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end external_declaration
@@ -568,7 +568,7 @@ class CParser(Parser):
declarator1 = None
-
+
self.function_definition_stack[-1].ModifierText = ''
self.function_definition_stack[-1].DeclText = ''
self.function_definition_stack[-1].LBLine = 0
@@ -782,7 +782,7 @@ class CParser(Parser):
if self.backtracking == 0:
-
+
if d is not None:
self.function_definition_stack[-1].ModifierText = self.input.toString(d.start,d.stop)
else:
@@ -796,7 +796,7 @@ class CParser(Parser):
else:
self.function_definition_stack[-1].LBLine = b.start.line
self.function_definition_stack[-1].LBOffset = b.start.charPositionInLine
-
+
@@ -804,7 +804,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
if self.backtracking == 0:
-
+
self.StoreFunctionDefinition(retval.start.line, retval.start.charPositionInLine, retval.stop.line, retval.stop.charPositionInLine, self.function_definition_stack[-1].ModifierText, self.function_definition_stack[-1].DeclText, self.function_definition_stack[-1].LBLine, self.function_definition_stack[-1].LBOffset, self.function_definition_stack[-1].DeclLine, self.function_definition_stack[-1].DeclOffset)
@@ -844,7 +844,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 4):
- return
+ return
# C.g:167:2: (a= 'typedef' (b= declaration_specifiers )? c= init_declarator_list d= ';' | s= declaration_specifiers (t= init_declarator_list )? e= ';' )
alt9 = 2
@@ -857,7 +857,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("166:1: declaration : (a= 'typedef' (b= declaration_specifiers )? c= init_declarator_list d= ';' | s= declaration_specifiers (t= init_declarator_list )? e= ';' );", 9, 0, self.input)
@@ -868,7 +868,7 @@ class CParser(Parser):
a = self.input.LT(1)
self.match(self.input, 26, self.FOLLOW_26_in_declaration203)
if self.failed:
- return
+ return
# C.g:167:17: (b= declaration_specifiers )?
alt7 = 2
LA7 = self.input.LA(1)
@@ -905,7 +905,7 @@ class CParser(Parser):
b = self.declaration_specifiers()
self.following.pop()
if self.failed:
- return
+ return
@@ -913,18 +913,18 @@ class CParser(Parser):
c = self.init_declarator_list()
self.following.pop()
if self.failed:
- return
+ return
d = self.input.LT(1)
self.match(self.input, 25, self.FOLLOW_25_in_declaration220)
if self.failed:
- return
+ return
if self.backtracking == 0:
-
+
if b is not None:
self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, self.input.toString(b.start,b.stop), self.input.toString(c.start,c.stop))
else:
self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start,c.stop))
-
+
@@ -934,7 +934,7 @@ class CParser(Parser):
s = self.declaration_specifiers()
self.following.pop()
if self.failed:
- return
+ return
# C.g:175:30: (t= init_declarator_list )?
alt8 = 2
LA8_0 = self.input.LA(1)
@@ -947,16 +947,16 @@ class CParser(Parser):
t = self.init_declarator_list()
self.following.pop()
if self.failed:
- return
+ return
e = self.input.LT(1)
self.match(self.input, 25, self.FOLLOW_25_in_declaration243)
if self.failed:
- return
+ return
if self.backtracking == 0:
-
+
if t is not None:
self.StoreVariableDeclaration(s.start.line, s.start.charPositionInLine, t.start.line, t.start.charPositionInLine, self.input.toString(s.start,s.stop), self.input.toString(t.start,t.stop))
@@ -973,7 +973,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end declaration
@@ -1184,7 +1184,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 7):
- return
+ return
# C.g:194:2: ( declarator ( '=' initializer )? )
# C.g:194:4: declarator ( '=' initializer )?
@@ -1192,7 +1192,7 @@ class CParser(Parser):
self.declarator()
self.following.pop()
if self.failed:
- return
+ return
# C.g:194:15: ( '=' initializer )?
alt12 = 2
LA12_0 = self.input.LA(1)
@@ -1203,12 +1203,12 @@ class CParser(Parser):
# C.g:194:16: '=' initializer
self.match(self.input, 28, self.FOLLOW_28_in_init_declarator329)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_initializer_in_init_declarator331)
self.initializer()
self.following.pop()
if self.failed:
- return
+ return
@@ -1225,7 +1225,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end init_declarator
@@ -1238,7 +1238,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 8):
- return
+ return
# C.g:198:2: ( 'extern' | 'static' | 'auto' | 'register' | 'STATIC' )
# C.g:
@@ -1250,7 +1250,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -1272,7 +1272,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end storage_class_specifier
@@ -1290,7 +1290,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 9):
- return
+ return
# C.g:206:2: ( 'void' | 'char' | 'short' | 'int' | 'long' | 'float' | 'double' | 'signed' | 'unsigned' | s= struct_or_union_specifier | e= enum_specifier | ( IDENTIFIER ( type_qualifier )* declarator )=> type_id )
alt13 = 12
@@ -1323,7 +1323,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("205:1: type_specifier : ( 'void' | 'char' | 'short' | 'int' | 'long' | 'float' | 'double' | 'signed' | 'unsigned' | s= struct_or_union_specifier | e= enum_specifier | ( IDENTIFIER ( type_qualifier )* declarator )=> type_id );", 13, 0, self.input)
@@ -1333,63 +1333,63 @@ class CParser(Parser):
# C.g:206:4: 'void'
self.match(self.input, 34, self.FOLLOW_34_in_type_specifier376)
if self.failed:
- return
+ return
elif alt13 == 2:
# C.g:207:4: 'char'
self.match(self.input, 35, self.FOLLOW_35_in_type_specifier381)
if self.failed:
- return
+ return
elif alt13 == 3:
# C.g:208:4: 'short'
self.match(self.input, 36, self.FOLLOW_36_in_type_specifier386)
if self.failed:
- return
+ return
elif alt13 == 4:
# C.g:209:4: 'int'
self.match(self.input, 37, self.FOLLOW_37_in_type_specifier391)
if self.failed:
- return
+ return
elif alt13 == 5:
# C.g:210:4: 'long'
self.match(self.input, 38, self.FOLLOW_38_in_type_specifier396)
if self.failed:
- return
+ return
elif alt13 == 6:
# C.g:211:4: 'float'
self.match(self.input, 39, self.FOLLOW_39_in_type_specifier401)
if self.failed:
- return
+ return
elif alt13 == 7:
# C.g:212:4: 'double'
self.match(self.input, 40, self.FOLLOW_40_in_type_specifier406)
if self.failed:
- return
+ return
elif alt13 == 8:
# C.g:213:4: 'signed'
self.match(self.input, 41, self.FOLLOW_41_in_type_specifier411)
if self.failed:
- return
+ return
elif alt13 == 9:
# C.g:214:4: 'unsigned'
self.match(self.input, 42, self.FOLLOW_42_in_type_specifier416)
if self.failed:
- return
+ return
elif alt13 == 10:
@@ -1398,9 +1398,9 @@ class CParser(Parser):
s = self.struct_or_union_specifier()
self.following.pop()
if self.failed:
- return
+ return
if self.backtracking == 0:
-
+
if s.stop is not None:
self.StoreStructUnionDefinition(s.start.line, s.start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.input.toString(s.start,s.stop))
@@ -1413,9 +1413,9 @@ class CParser(Parser):
e = self.enum_specifier()
self.following.pop()
if self.failed:
- return
+ return
if self.backtracking == 0:
-
+
if e.stop is not None:
self.StoreEnumerationDefinition(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
@@ -1428,7 +1428,7 @@ class CParser(Parser):
self.type_id()
self.following.pop()
if self.failed:
- return
+ return
@@ -1441,7 +1441,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end type_specifier
@@ -1454,13 +1454,13 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 10):
- return
+ return
# C.g:229:5: ( IDENTIFIER )
# C.g:229:9: IDENTIFIER
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_type_id467)
if self.failed:
- return
+ return
@@ -1474,7 +1474,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end type_id
@@ -1611,7 +1611,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 12):
- return
+ return
# C.g:240:2: ( 'struct' | 'union' )
# C.g:
@@ -1623,7 +1623,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -1645,7 +1645,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end struct_or_union
@@ -1658,7 +1658,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 13):
- return
+ return
# C.g:245:2: ( ( struct_declaration )+ )
# C.g:245:4: ( struct_declaration )+
@@ -1678,7 +1678,7 @@ class CParser(Parser):
self.struct_declaration()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -1687,7 +1687,7 @@ class CParser(Parser):
if self.backtracking > 0:
self.failed = True
- return
+ return
eee = EarlyExitException(16, self.input)
raise eee
@@ -1708,7 +1708,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end struct_declaration_list
@@ -1721,7 +1721,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 14):
- return
+ return
# C.g:249:2: ( specifier_qualifier_list struct_declarator_list ';' )
# C.g:249:4: specifier_qualifier_list struct_declarator_list ';'
@@ -1729,15 +1729,15 @@ class CParser(Parser):
self.specifier_qualifier_list()
self.following.pop()
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_struct_declarator_list_in_struct_declaration551)
self.struct_declarator_list()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 25, self.FOLLOW_25_in_struct_declaration553)
if self.failed:
- return
+ return
@@ -1751,7 +1751,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end struct_declaration
@@ -1764,7 +1764,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 15):
- return
+ return
# C.g:253:2: ( ( type_qualifier | type_specifier )+ )
# C.g:253:4: ( type_qualifier | type_specifier )+
@@ -1831,7 +1831,7 @@ class CParser(Parser):
self.type_qualifier()
self.following.pop()
if self.failed:
- return
+ return
elif alt17 == 2:
@@ -1840,7 +1840,7 @@ class CParser(Parser):
self.type_specifier()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -1849,7 +1849,7 @@ class CParser(Parser):
if self.backtracking > 0:
self.failed = True
- return
+ return
eee = EarlyExitException(17, self.input)
raise eee
@@ -1870,7 +1870,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end specifier_qualifier_list
@@ -1883,7 +1883,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 16):
- return
+ return
# C.g:257:2: ( struct_declarator ( ',' struct_declarator )* )
# C.g:257:4: struct_declarator ( ',' struct_declarator )*
@@ -1891,7 +1891,7 @@ class CParser(Parser):
self.struct_declarator()
self.following.pop()
if self.failed:
- return
+ return
# C.g:257:22: ( ',' struct_declarator )*
while True: #loop18
alt18 = 2
@@ -1905,12 +1905,12 @@ class CParser(Parser):
# C.g:257:23: ',' struct_declarator
self.match(self.input, 27, self.FOLLOW_27_in_struct_declarator_list587)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_struct_declarator_in_struct_declarator_list589)
self.struct_declarator()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -1930,7 +1930,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end struct_declarator_list
@@ -1943,7 +1943,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 17):
- return
+ return
# C.g:261:2: ( declarator ( ':' constant_expression )? | ':' constant_expression )
alt20 = 2
@@ -1956,7 +1956,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("260:1: struct_declarator : ( declarator ( ':' constant_expression )? | ':' constant_expression );", 20, 0, self.input)
@@ -1968,7 +1968,7 @@ class CParser(Parser):
self.declarator()
self.following.pop()
if self.failed:
- return
+ return
# C.g:261:15: ( ':' constant_expression )?
alt19 = 2
LA19_0 = self.input.LA(1)
@@ -1979,12 +1979,12 @@ class CParser(Parser):
# C.g:261:16: ':' constant_expression
self.match(self.input, 47, self.FOLLOW_47_in_struct_declarator605)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_constant_expression_in_struct_declarator607)
self.constant_expression()
self.following.pop()
if self.failed:
- return
+ return
@@ -1994,12 +1994,12 @@ class CParser(Parser):
# C.g:262:4: ':' constant_expression
self.match(self.input, 47, self.FOLLOW_47_in_struct_declarator614)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_constant_expression_in_struct_declarator616)
self.constant_expression()
self.following.pop()
if self.failed:
- return
+ return
@@ -2012,7 +2012,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end struct_declarator
@@ -2180,7 +2180,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 19):
- return
+ return
# C.g:273:2: ( enumerator ( ',' enumerator )* )
# C.g:273:4: enumerator ( ',' enumerator )*
@@ -2188,7 +2188,7 @@ class CParser(Parser):
self.enumerator()
self.following.pop()
if self.failed:
- return
+ return
# C.g:273:15: ( ',' enumerator )*
while True: #loop24
alt24 = 2
@@ -2207,12 +2207,12 @@ class CParser(Parser):
# C.g:273:16: ',' enumerator
self.match(self.input, 27, self.FOLLOW_27_in_enumerator_list680)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_enumerator_in_enumerator_list682)
self.enumerator()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -2232,7 +2232,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end enumerator_list
@@ -2245,13 +2245,13 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 20):
- return
+ return
# C.g:277:2: ( IDENTIFIER ( '=' constant_expression )? )
# C.g:277:4: IDENTIFIER ( '=' constant_expression )?
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_enumerator695)
if self.failed:
- return
+ return
# C.g:277:15: ( '=' constant_expression )?
alt25 = 2
LA25_0 = self.input.LA(1)
@@ -2262,12 +2262,12 @@ class CParser(Parser):
# C.g:277:16: '=' constant_expression
self.match(self.input, 28, self.FOLLOW_28_in_enumerator698)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_constant_expression_in_enumerator700)
self.constant_expression()
self.following.pop()
if self.failed:
- return
+ return
@@ -2284,7 +2284,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end enumerator
@@ -2297,7 +2297,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 21):
- return
+ return
# C.g:281:2: ( 'const' | 'volatile' | 'IN' | 'OUT' | 'OPTIONAL' | 'CONST' | 'UNALIGNED' | 'VOLATILE' | 'GLOBAL_REMOVE_IF_UNREFERENCED' | 'EFIAPI' | 'EFI_BOOTSERVICE' | 'EFI_RUNTIMESERVICE' | 'PACKED' )
# C.g:
@@ -2309,7 +2309,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -2331,7 +2331,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end type_qualifier
@@ -2486,7 +2486,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 23):
- return
+ return
# C.g:303:2: ( IDENTIFIER ( declarator_suffix )* | '(' ( 'EFIAPI' )? declarator ')' ( declarator_suffix )+ )
alt34 = 2
@@ -2499,7 +2499,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("302:1: direct_declarator : ( IDENTIFIER ( declarator_suffix )* | '(' ( 'EFIAPI' )? declarator ')' ( declarator_suffix )+ );", 34, 0, self.input)
@@ -2509,7 +2509,7 @@ class CParser(Parser):
# C.g:303:4: IDENTIFIER ( declarator_suffix )*
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_direct_declarator819)
if self.failed:
- return
+ return
# C.g:303:15: ( declarator_suffix )*
while True: #loop31
alt31 = 2
@@ -2753,7 +2753,7 @@ class CParser(Parser):
self.declarator_suffix()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -2766,7 +2766,7 @@ class CParser(Parser):
# C.g:304:4: '(' ( 'EFIAPI' )? declarator ')' ( declarator_suffix )+
self.match(self.input, 62, self.FOLLOW_62_in_direct_declarator827)
if self.failed:
- return
+ return
# C.g:304:8: ( 'EFIAPI' )?
alt32 = 2
LA32_0 = self.input.LA(1)
@@ -2780,7 +2780,7 @@ class CParser(Parser):
# C.g:304:9: 'EFIAPI'
self.match(self.input, 58, self.FOLLOW_58_in_direct_declarator830)
if self.failed:
- return
+ return
@@ -2788,10 +2788,10 @@ class CParser(Parser):
self.declarator()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_direct_declarator836)
if self.failed:
- return
+ return
# C.g:304:35: ( declarator_suffix )+
cnt33 = 0
while True: #loop33
@@ -3036,7 +3036,7 @@ class CParser(Parser):
self.declarator_suffix()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -3045,7 +3045,7 @@ class CParser(Parser):
if self.backtracking > 0:
self.failed = True
- return
+ return
eee = EarlyExitException(33, self.input)
raise eee
@@ -3065,7 +3065,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end direct_declarator
@@ -3078,7 +3078,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 24):
- return
+ return
# C.g:308:2: ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' )
alt35 = 5
@@ -3094,7 +3094,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 1, self.input)
@@ -3116,7 +3116,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 29, self.input)
@@ -3125,7 +3125,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 2, self.input)
@@ -3134,7 +3134,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 0, self.input)
@@ -3144,65 +3144,65 @@ class CParser(Parser):
# C.g:308:6: '[' constant_expression ']'
self.match(self.input, 64, self.FOLLOW_64_in_declarator_suffix852)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_constant_expression_in_declarator_suffix854)
self.constant_expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 65, self.FOLLOW_65_in_declarator_suffix856)
if self.failed:
- return
+ return
elif alt35 == 2:
# C.g:309:9: '[' ']'
self.match(self.input, 64, self.FOLLOW_64_in_declarator_suffix866)
if self.failed:
- return
+ return
self.match(self.input, 65, self.FOLLOW_65_in_declarator_suffix868)
if self.failed:
- return
+ return
elif alt35 == 3:
# C.g:310:9: '(' parameter_type_list ')'
self.match(self.input, 62, self.FOLLOW_62_in_declarator_suffix878)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_parameter_type_list_in_declarator_suffix880)
self.parameter_type_list()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_declarator_suffix882)
if self.failed:
- return
+ return
elif alt35 == 4:
# C.g:311:9: '(' identifier_list ')'
self.match(self.input, 62, self.FOLLOW_62_in_declarator_suffix892)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_identifier_list_in_declarator_suffix894)
self.identifier_list()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_declarator_suffix896)
if self.failed:
- return
+ return
elif alt35 == 5:
# C.g:312:9: '(' ')'
self.match(self.input, 62, self.FOLLOW_62_in_declarator_suffix906)
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_declarator_suffix908)
if self.failed:
- return
+ return
@@ -3215,7 +3215,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end declarator_suffix
@@ -3228,7 +3228,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 25):
- return
+ return
# C.g:316:2: ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' )
alt38 = 3
@@ -3246,7 +3246,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 2, self.input)
@@ -3262,7 +3262,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 3, self.input)
@@ -3278,7 +3278,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 4, self.input)
@@ -3294,7 +3294,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 5, self.input)
@@ -3312,7 +3312,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 21, self.input)
@@ -3328,7 +3328,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 29, self.input)
@@ -3337,7 +3337,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 1, self.input)
@@ -3346,7 +3346,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 0, self.input)
@@ -3356,7 +3356,7 @@ class CParser(Parser):
# C.g:316:4: '*' ( type_qualifier )+ ( pointer )?
self.match(self.input, 66, self.FOLLOW_66_in_pointer919)
if self.failed:
- return
+ return
# C.g:316:8: ( type_qualifier )+
cnt36 = 0
while True: #loop36
@@ -3404,7 +3404,7 @@ class CParser(Parser):
self.type_qualifier()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -3413,7 +3413,7 @@ class CParser(Parser):
if self.backtracking > 0:
self.failed = True
- return
+ return
eee = EarlyExitException(36, self.input)
raise eee
@@ -3436,7 +3436,7 @@ class CParser(Parser):
self.pointer()
self.following.pop()
if self.failed:
- return
+ return
@@ -3446,19 +3446,19 @@ class CParser(Parser):
# C.g:317:4: '*' pointer
self.match(self.input, 66, self.FOLLOW_66_in_pointer930)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_pointer_in_pointer932)
self.pointer()
self.following.pop()
if self.failed:
- return
+ return
elif alt38 == 3:
# C.g:318:4: '*'
self.match(self.input, 66, self.FOLLOW_66_in_pointer937)
if self.failed:
- return
+ return
@@ -3471,7 +3471,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end pointer
@@ -3484,7 +3484,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 26):
- return
+ return
# C.g:322:2: ( parameter_list ( ',' ( 'OPTIONAL' )? '...' )? )
# C.g:322:4: parameter_list ( ',' ( 'OPTIONAL' )? '...' )?
@@ -3492,7 +3492,7 @@ class CParser(Parser):
self.parameter_list()
self.following.pop()
if self.failed:
- return
+ return
# C.g:322:19: ( ',' ( 'OPTIONAL' )? '...' )?
alt40 = 2
LA40_0 = self.input.LA(1)
@@ -3503,7 +3503,7 @@ class CParser(Parser):
# C.g:322:20: ',' ( 'OPTIONAL' )? '...'
self.match(self.input, 27, self.FOLLOW_27_in_parameter_type_list951)
if self.failed:
- return
+ return
# C.g:322:24: ( 'OPTIONAL' )?
alt39 = 2
LA39_0 = self.input.LA(1)
@@ -3514,13 +3514,13 @@ class CParser(Parser):
# C.g:322:25: 'OPTIONAL'
self.match(self.input, 53, self.FOLLOW_53_in_parameter_type_list954)
if self.failed:
- return
+ return
self.match(self.input, 67, self.FOLLOW_67_in_parameter_type_list958)
if self.failed:
- return
+ return
@@ -3537,7 +3537,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end parameter_type_list
@@ -3550,7 +3550,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 27):
- return
+ return
# C.g:326:2: ( parameter_declaration ( ',' ( 'OPTIONAL' )? parameter_declaration )* )
# C.g:326:4: parameter_declaration ( ',' ( 'OPTIONAL' )? parameter_declaration )*
@@ -3558,7 +3558,7 @@ class CParser(Parser):
self.parameter_declaration()
self.following.pop()
if self.failed:
- return
+ return
# C.g:326:26: ( ',' ( 'OPTIONAL' )? parameter_declaration )*
while True: #loop42
alt42 = 2
@@ -3584,7 +3584,7 @@ class CParser(Parser):
# C.g:326:27: ',' ( 'OPTIONAL' )? parameter_declaration
self.match(self.input, 27, self.FOLLOW_27_in_parameter_list974)
if self.failed:
- return
+ return
# C.g:326:31: ( 'OPTIONAL' )?
alt41 = 2
LA41_0 = self.input.LA(1)
@@ -3598,7 +3598,7 @@ class CParser(Parser):
# C.g:326:32: 'OPTIONAL'
self.match(self.input, 53, self.FOLLOW_53_in_parameter_list977)
if self.failed:
- return
+ return
@@ -3606,7 +3606,7 @@ class CParser(Parser):
self.parameter_declaration()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -3626,7 +3626,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end parameter_list
@@ -3639,7 +3639,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 28):
- return
+ return
# C.g:330:2: ( declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )? | ( pointer )* IDENTIFIER )
alt46 = 2
@@ -3656,7 +3656,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("329:1: parameter_declaration : ( declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )? | ( pointer )* IDENTIFIER );", 46, 13, self.input)
@@ -3667,7 +3667,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("329:1: parameter_declaration : ( declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )? | ( pointer )* IDENTIFIER );", 46, 0, self.input)
@@ -3679,7 +3679,7 @@ class CParser(Parser):
self.declaration_specifiers()
self.following.pop()
if self.failed:
- return
+ return
# C.g:330:27: ( declarator | abstract_declarator )*
while True: #loop43
alt43 = 3
@@ -3763,7 +3763,7 @@ class CParser(Parser):
self.declarator()
self.following.pop()
if self.failed:
- return
+ return
elif alt43 == 2:
@@ -3772,7 +3772,7 @@ class CParser(Parser):
self.abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -3789,7 +3789,7 @@ class CParser(Parser):
# C.g:330:62: 'OPTIONAL'
self.match(self.input, 53, self.FOLLOW_53_in_parameter_declaration1004)
if self.failed:
- return
+ return
@@ -3812,7 +3812,7 @@ class CParser(Parser):
self.pointer()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -3821,7 +3821,7 @@ class CParser(Parser):
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_parameter_declaration1016)
if self.failed:
- return
+ return
@@ -3834,7 +3834,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end parameter_declaration
@@ -3847,13 +3847,13 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 29):
- return
+ return
# C.g:336:2: ( IDENTIFIER ( ',' IDENTIFIER )* )
# C.g:336:4: IDENTIFIER ( ',' IDENTIFIER )*
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_identifier_list1027)
if self.failed:
- return
+ return
# C.g:337:2: ( ',' IDENTIFIER )*
while True: #loop47
alt47 = 2
@@ -3867,10 +3867,10 @@ class CParser(Parser):
# C.g:337:3: ',' IDENTIFIER
self.match(self.input, 27, self.FOLLOW_27_in_identifier_list1031)
if self.failed:
- return
+ return
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_identifier_list1033)
if self.failed:
- return
+ return
else:
@@ -3890,7 +3890,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end identifier_list
@@ -3903,7 +3903,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 30):
- return
+ return
# C.g:341:2: ( specifier_qualifier_list ( abstract_declarator )? | type_id )
alt49 = 2
@@ -3921,7 +3921,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("340:1: type_name : ( specifier_qualifier_list ( abstract_declarator )? | type_id );", 49, 13, self.input)
@@ -3930,7 +3930,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("340:1: type_name : ( specifier_qualifier_list ( abstract_declarator )? | type_id );", 49, 0, self.input)
@@ -3942,7 +3942,7 @@ class CParser(Parser):
self.specifier_qualifier_list()
self.following.pop()
if self.failed:
- return
+ return
# C.g:341:29: ( abstract_declarator )?
alt48 = 2
LA48_0 = self.input.LA(1)
@@ -3955,7 +3955,7 @@ class CParser(Parser):
self.abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
@@ -3967,7 +3967,7 @@ class CParser(Parser):
self.type_id()
self.following.pop()
if self.failed:
- return
+ return
@@ -3980,7 +3980,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end type_name
@@ -3993,7 +3993,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 31):
- return
+ return
# C.g:346:2: ( pointer ( direct_abstract_declarator )? | direct_abstract_declarator )
alt51 = 2
@@ -4006,7 +4006,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("345:1: abstract_declarator : ( pointer ( direct_abstract_declarator )? | direct_abstract_declarator );", 51, 0, self.input)
@@ -4018,7 +4018,7 @@ class CParser(Parser):
self.pointer()
self.following.pop()
if self.failed:
- return
+ return
# C.g:346:12: ( direct_abstract_declarator )?
alt50 = 2
LA50_0 = self.input.LA(1)
@@ -4203,7 +4203,7 @@ class CParser(Parser):
self.direct_abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
@@ -4215,7 +4215,7 @@ class CParser(Parser):
self.direct_abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
@@ -4228,7 +4228,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end abstract_declarator
@@ -4241,7 +4241,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 32):
- return
+ return
# C.g:351:2: ( ( '(' abstract_declarator ')' | abstract_declarator_suffix ) ( abstract_declarator_suffix )* )
# C.g:351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix ) ( abstract_declarator_suffix )*
@@ -4263,7 +4263,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix )", 52, 18, self.input)
@@ -4274,7 +4274,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix )", 52, 1, self.input)
@@ -4285,7 +4285,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix )", 52, 0, self.input)
@@ -4295,15 +4295,15 @@ class CParser(Parser):
# C.g:351:6: '(' abstract_declarator ')'
self.match(self.input, 62, self.FOLLOW_62_in_direct_abstract_declarator1086)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_abstract_declarator_in_direct_abstract_declarator1088)
self.abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_direct_abstract_declarator1090)
if self.failed:
- return
+ return
elif alt52 == 2:
@@ -4312,7 +4312,7 @@ class CParser(Parser):
self.abstract_declarator_suffix()
self.following.pop()
if self.failed:
- return
+ return
@@ -4559,7 +4559,7 @@ class CParser(Parser):
self.abstract_declarator_suffix()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -4579,7 +4579,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end direct_abstract_declarator
@@ -4592,7 +4592,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 33):
- return
+ return
# C.g:355:2: ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' )
alt54 = 4
@@ -4608,7 +4608,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );", 54, 1, self.input)
@@ -4624,7 +4624,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );", 54, 2, self.input)
@@ -4633,7 +4633,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );", 54, 0, self.input)
@@ -4643,50 +4643,50 @@ class CParser(Parser):
# C.g:355:4: '[' ']'
self.match(self.input, 64, self.FOLLOW_64_in_abstract_declarator_suffix1110)
if self.failed:
- return
+ return
self.match(self.input, 65, self.FOLLOW_65_in_abstract_declarator_suffix1112)
if self.failed:
- return
+ return
elif alt54 == 2:
# C.g:356:4: '[' constant_expression ']'
self.match(self.input, 64, self.FOLLOW_64_in_abstract_declarator_suffix1117)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_constant_expression_in_abstract_declarator_suffix1119)
self.constant_expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 65, self.FOLLOW_65_in_abstract_declarator_suffix1121)
if self.failed:
- return
+ return
elif alt54 == 3:
# C.g:357:4: '(' ')'
self.match(self.input, 62, self.FOLLOW_62_in_abstract_declarator_suffix1126)
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_abstract_declarator_suffix1128)
if self.failed:
- return
+ return
elif alt54 == 4:
# C.g:358:4: '(' parameter_type_list ')'
self.match(self.input, 62, self.FOLLOW_62_in_abstract_declarator_suffix1133)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_parameter_type_list_in_abstract_declarator_suffix1135)
self.parameter_type_list()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_abstract_declarator_suffix1137)
if self.failed:
- return
+ return
@@ -4699,7 +4699,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end abstract_declarator_suffix
@@ -4712,7 +4712,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 34):
- return
+ return
# C.g:363:2: ( assignment_expression | '{' initializer_list ( ',' )? '}' )
alt56 = 2
@@ -4725,7 +4725,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("361:1: initializer : ( assignment_expression | '{' initializer_list ( ',' )? '}' );", 56, 0, self.input)
@@ -4737,19 +4737,19 @@ class CParser(Parser):
self.assignment_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt56 == 2:
# C.g:364:4: '{' initializer_list ( ',' )? '}'
self.match(self.input, 43, self.FOLLOW_43_in_initializer1155)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_initializer_list_in_initializer1157)
self.initializer_list()
self.following.pop()
if self.failed:
- return
+ return
# C.g:364:25: ( ',' )?
alt55 = 2
LA55_0 = self.input.LA(1)
@@ -4760,13 +4760,13 @@ class CParser(Parser):
# C.g:0:0: ','
self.match(self.input, 27, self.FOLLOW_27_in_initializer1159)
if self.failed:
- return
+ return
self.match(self.input, 44, self.FOLLOW_44_in_initializer1162)
if self.failed:
- return
+ return
@@ -4779,7 +4779,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end initializer
@@ -4792,7 +4792,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 35):
- return
+ return
# C.g:368:2: ( initializer ( ',' initializer )* )
# C.g:368:4: initializer ( ',' initializer )*
@@ -4800,7 +4800,7 @@ class CParser(Parser):
self.initializer()
self.following.pop()
if self.failed:
- return
+ return
# C.g:368:16: ( ',' initializer )*
while True: #loop57
alt57 = 2
@@ -4819,12 +4819,12 @@ class CParser(Parser):
# C.g:368:17: ',' initializer
self.match(self.input, 27, self.FOLLOW_27_in_initializer_list1176)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_initializer_in_initializer_list1178)
self.initializer()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -4844,7 +4844,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end initializer_list
@@ -4955,7 +4955,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 37):
- return
+ return
# C.g:378:2: ( ( multiplicative_expression ) ( '+' multiplicative_expression | '-' multiplicative_expression )* )
# C.g:378:4: ( multiplicative_expression ) ( '+' multiplicative_expression | '-' multiplicative_expression )*
@@ -4965,7 +4965,7 @@ class CParser(Parser):
self.multiplicative_expression()
self.following.pop()
if self.failed:
- return
+ return
@@ -4984,24 +4984,24 @@ class CParser(Parser):
# C.g:378:33: '+' multiplicative_expression
self.match(self.input, 68, self.FOLLOW_68_in_additive_expression1229)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_multiplicative_expression_in_additive_expression1231)
self.multiplicative_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt61 == 2:
# C.g:378:65: '-' multiplicative_expression
self.match(self.input, 69, self.FOLLOW_69_in_additive_expression1235)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_multiplicative_expression_in_additive_expression1237)
self.multiplicative_expression()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -5021,7 +5021,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end additive_expression
@@ -5034,7 +5034,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 38):
- return
+ return
# C.g:382:2: ( ( cast_expression ) ( '*' cast_expression | '/' cast_expression | '%' cast_expression )* )
# C.g:382:4: ( cast_expression ) ( '*' cast_expression | '/' cast_expression | '%' cast_expression )*
@@ -5044,7 +5044,7 @@ class CParser(Parser):
self.cast_expression()
self.following.pop()
if self.failed:
- return
+ return
@@ -5063,36 +5063,36 @@ class CParser(Parser):
# C.g:382:23: '*' cast_expression
self.match(self.input, 66, self.FOLLOW_66_in_multiplicative_expression1255)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_cast_expression_in_multiplicative_expression1257)
self.cast_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt62 == 2:
# C.g:382:45: '/' cast_expression
self.match(self.input, 70, self.FOLLOW_70_in_multiplicative_expression1261)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_cast_expression_in_multiplicative_expression1263)
self.cast_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt62 == 3:
# C.g:382:67: '%' cast_expression
self.match(self.input, 71, self.FOLLOW_71_in_multiplicative_expression1267)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_cast_expression_in_multiplicative_expression1269)
self.cast_expression()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -5112,7 +5112,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end multiplicative_expression
@@ -5125,7 +5125,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 39):
- return
+ return
# C.g:386:2: ( '(' type_name ')' cast_expression | unary_expression )
alt63 = 2
@@ -5145,7 +5145,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );", 63, 25, self.input)
@@ -5156,7 +5156,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );", 63, 1, self.input)
@@ -5167,7 +5167,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );", 63, 0, self.input)
@@ -5177,20 +5177,20 @@ class CParser(Parser):
# C.g:386:4: '(' type_name ')' cast_expression
self.match(self.input, 62, self.FOLLOW_62_in_cast_expression1282)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_type_name_in_cast_expression1284)
self.type_name()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_cast_expression1286)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_cast_expression_in_cast_expression1288)
self.cast_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt63 == 2:
@@ -5199,7 +5199,7 @@ class CParser(Parser):
self.unary_expression()
self.following.pop()
if self.failed:
- return
+ return
@@ -5212,7 +5212,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end cast_expression
@@ -5225,7 +5225,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 40):
- return
+ return
# C.g:391:2: ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' )
alt64 = 6
@@ -5251,7 +5251,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );", 64, 13, self.input)
@@ -5262,7 +5262,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );", 64, 12, self.input)
@@ -5271,7 +5271,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );", 64, 0, self.input)
@@ -5283,31 +5283,31 @@ class CParser(Parser):
self.postfix_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt64 == 2:
# C.g:392:4: '++' unary_expression
self.match(self.input, 72, self.FOLLOW_72_in_unary_expression1309)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_unary_expression_in_unary_expression1311)
self.unary_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt64 == 3:
# C.g:393:4: '--' unary_expression
self.match(self.input, 73, self.FOLLOW_73_in_unary_expression1316)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_unary_expression_in_unary_expression1318)
self.unary_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt64 == 4:
@@ -5316,42 +5316,42 @@ class CParser(Parser):
self.unary_operator()
self.following.pop()
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_cast_expression_in_unary_expression1325)
self.cast_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt64 == 5:
# C.g:395:4: 'sizeof' unary_expression
self.match(self.input, 74, self.FOLLOW_74_in_unary_expression1330)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_unary_expression_in_unary_expression1332)
self.unary_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt64 == 6:
# C.g:396:4: 'sizeof' '(' type_name ')'
self.match(self.input, 74, self.FOLLOW_74_in_unary_expression1337)
if self.failed:
- return
+ return
self.match(self.input, 62, self.FOLLOW_62_in_unary_expression1339)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_type_name_in_unary_expression1341)
self.type_name()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_unary_expression1343)
if self.failed:
- return
+ return
@@ -5364,7 +5364,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end unary_expression
@@ -5384,13 +5384,13 @@ class CParser(Parser):
c = None
-
+
self.postfix_expression_stack[-1].FuncCallText = ''
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 41):
- return
+ return
# C.g:406:2: (p= primary_expression ( '[' expression ']' | '(' a= ')' | '(' c= argument_expression_list b= ')' | '(' macro_parameter_list ')' | '.' x= IDENTIFIER | '*' y= IDENTIFIER | '->' z= IDENTIFIER | '++' | '--' )* )
# C.g:406:6: p= primary_expression ( '[' expression ']' | '(' a= ')' | '(' c= argument_expression_list b= ')' | '(' macro_parameter_list ')' | '.' x= IDENTIFIER | '*' y= IDENTIFIER | '->' z= IDENTIFIER | '++' | '--' )*
@@ -5398,7 +5398,7 @@ class CParser(Parser):
p = self.primary_expression()
self.following.pop()
if self.failed:
- return
+ return
if self.backtracking == 0:
self.postfix_expression_stack[-1].FuncCallText += self.input.toString(p.start,p.stop)
@@ -5460,26 +5460,26 @@ class CParser(Parser):
# C.g:407:13: '[' expression ']'
self.match(self.input, 64, self.FOLLOW_64_in_postfix_expression1383)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_expression_in_postfix_expression1385)
self.expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 65, self.FOLLOW_65_in_postfix_expression1387)
if self.failed:
- return
+ return
elif alt65 == 2:
# C.g:408:13: '(' a= ')'
self.match(self.input, 62, self.FOLLOW_62_in_postfix_expression1401)
if self.failed:
- return
+ return
a = self.input.LT(1)
self.match(self.input, 63, self.FOLLOW_63_in_postfix_expression1405)
if self.failed:
- return
+ return
if self.backtracking == 0:
self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, a.line, a.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, '')
@@ -5489,16 +5489,16 @@ class CParser(Parser):
# C.g:409:13: '(' c= argument_expression_list b= ')'
self.match(self.input, 62, self.FOLLOW_62_in_postfix_expression1420)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_argument_expression_list_in_postfix_expression1424)
c = self.argument_expression_list()
self.following.pop()
if self.failed:
- return
+ return
b = self.input.LT(1)
self.match(self.input, 63, self.FOLLOW_63_in_postfix_expression1428)
if self.failed:
- return
+ return
if self.backtracking == 0:
self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, self.input.toString(c.start,c.stop))
@@ -5508,26 +5508,26 @@ class CParser(Parser):
# C.g:410:13: '(' macro_parameter_list ')'
self.match(self.input, 62, self.FOLLOW_62_in_postfix_expression1444)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_macro_parameter_list_in_postfix_expression1446)
self.macro_parameter_list()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_postfix_expression1448)
if self.failed:
- return
+ return
elif alt65 == 5:
# C.g:411:13: '.' x= IDENTIFIER
self.match(self.input, 75, self.FOLLOW_75_in_postfix_expression1462)
if self.failed:
- return
+ return
x = self.input.LT(1)
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_postfix_expression1466)
if self.failed:
- return
+ return
if self.backtracking == 0:
self.postfix_expression_stack[-1].FuncCallText += '.' + x.text
@@ -5537,11 +5537,11 @@ class CParser(Parser):
# C.g:412:13: '*' y= IDENTIFIER
self.match(self.input, 66, self.FOLLOW_66_in_postfix_expression1482)
if self.failed:
- return
+ return
y = self.input.LT(1)
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_postfix_expression1486)
if self.failed:
- return
+ return
if self.backtracking == 0:
self.postfix_expression_stack[-1].FuncCallText = y.text
@@ -5551,11 +5551,11 @@ class CParser(Parser):
# C.g:413:13: '->' z= IDENTIFIER
self.match(self.input, 76, self.FOLLOW_76_in_postfix_expression1502)
if self.failed:
- return
+ return
z = self.input.LT(1)
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_postfix_expression1506)
if self.failed:
- return
+ return
if self.backtracking == 0:
self.postfix_expression_stack[-1].FuncCallText += '->' + z.text
@@ -5565,14 +5565,14 @@ class CParser(Parser):
# C.g:414:13: '++'
self.match(self.input, 72, self.FOLLOW_72_in_postfix_expression1522)
if self.failed:
- return
+ return
elif alt65 == 9:
# C.g:415:13: '--'
self.match(self.input, 73, self.FOLLOW_73_in_postfix_expression1536)
if self.failed:
- return
+ return
else:
@@ -5593,7 +5593,7 @@ class CParser(Parser):
self.postfix_expression_stack.pop()
pass
- return
+ return
# $ANTLR end postfix_expression
@@ -5606,7 +5606,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 42):
- return
+ return
# C.g:420:2: ( parameter_declaration ( ',' parameter_declaration )* )
# C.g:420:4: parameter_declaration ( ',' parameter_declaration )*
@@ -5614,7 +5614,7 @@ class CParser(Parser):
self.parameter_declaration()
self.following.pop()
if self.failed:
- return
+ return
# C.g:420:26: ( ',' parameter_declaration )*
while True: #loop66
alt66 = 2
@@ -5628,12 +5628,12 @@ class CParser(Parser):
# C.g:420:27: ',' parameter_declaration
self.match(self.input, 27, self.FOLLOW_27_in_macro_parameter_list1562)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_parameter_declaration_in_macro_parameter_list1564)
self.parameter_declaration()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -5653,7 +5653,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end macro_parameter_list
@@ -5666,7 +5666,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 43):
- return
+ return
# C.g:424:2: ( '&' | '*' | '+' | '-' | '~' | '!' )
# C.g:
@@ -5678,7 +5678,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -5700,7 +5700,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end unary_operator
@@ -5811,7 +5811,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 45):
- return
+ return
# C.g:439:5: ( HEX_LITERAL | OCTAL_LITERAL | DECIMAL_LITERAL | CHARACTER_LITERAL | ( ( IDENTIFIER )* ( STRING_LITERAL )+ )+ ( IDENTIFIER )* | FLOATING_POINT_LITERAL )
alt72 = 6
@@ -5831,7 +5831,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("438:1: constant : ( HEX_LITERAL | OCTAL_LITERAL | DECIMAL_LITERAL | CHARACTER_LITERAL | ( ( IDENTIFIER )* ( STRING_LITERAL )+ )+ ( IDENTIFIER )* | FLOATING_POINT_LITERAL );", 72, 0, self.input)
@@ -5841,28 +5841,28 @@ class CParser(Parser):
# C.g:439:9: HEX_LITERAL
self.match(self.input, HEX_LITERAL, self.FOLLOW_HEX_LITERAL_in_constant1643)
if self.failed:
- return
+ return
elif alt72 == 2:
# C.g:440:9: OCTAL_LITERAL
self.match(self.input, OCTAL_LITERAL, self.FOLLOW_OCTAL_LITERAL_in_constant1653)
if self.failed:
- return
+ return
elif alt72 == 3:
# C.g:441:9: DECIMAL_LITERAL
self.match(self.input, DECIMAL_LITERAL, self.FOLLOW_DECIMAL_LITERAL_in_constant1663)
if self.failed:
- return
+ return
elif alt72 == 4:
# C.g:442:7: CHARACTER_LITERAL
self.match(self.input, CHARACTER_LITERAL, self.FOLLOW_CHARACTER_LITERAL_in_constant1671)
if self.failed:
- return
+ return
elif alt72 == 5:
@@ -5906,7 +5906,7 @@ class CParser(Parser):
# C.g:0:0: IDENTIFIER
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_constant1680)
if self.failed:
- return
+ return
else:
@@ -5932,7 +5932,7 @@ class CParser(Parser):
# C.g:0:0: STRING_LITERAL
self.match(self.input, STRING_LITERAL, self.FOLLOW_STRING_LITERAL_in_constant1683)
if self.failed:
- return
+ return
else:
@@ -5941,7 +5941,7 @@ class CParser(Parser):
if self.backtracking > 0:
self.failed = True
- return
+ return
eee = EarlyExitException(69, self.input)
raise eee
@@ -5957,7 +5957,7 @@ class CParser(Parser):
if self.backtracking > 0:
self.failed = True
- return
+ return
eee = EarlyExitException(70, self.input)
raise eee
@@ -5978,7 +5978,7 @@ class CParser(Parser):
# C.g:0:0: IDENTIFIER
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_constant1688)
if self.failed:
- return
+ return
else:
@@ -5991,7 +5991,7 @@ class CParser(Parser):
# C.g:444:9: FLOATING_POINT_LITERAL
self.match(self.input, FLOATING_POINT_LITERAL, self.FOLLOW_FLOATING_POINT_LITERAL_in_constant1699)
if self.failed:
- return
+ return
@@ -6004,7 +6004,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end constant
@@ -6087,7 +6087,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 47):
- return
+ return
# C.g:454:2: ( conditional_expression )
# C.g:454:4: conditional_expression
@@ -6095,7 +6095,7 @@ class CParser(Parser):
self.conditional_expression()
self.following.pop()
if self.failed:
- return
+ return
@@ -6109,7 +6109,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end constant_expression
@@ -6122,7 +6122,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 48):
- return
+ return
# C.g:458:2: ( lvalue assignment_operator assignment_expression | conditional_expression )
alt74 = 2
@@ -6139,7 +6139,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 13, self.input)
@@ -6155,7 +6155,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 14, self.input)
@@ -6171,7 +6171,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 15, self.input)
@@ -6187,7 +6187,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 16, self.input)
@@ -6203,7 +6203,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 17, self.input)
@@ -6219,7 +6219,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 18, self.input)
@@ -6235,7 +6235,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 19, self.input)
@@ -6253,7 +6253,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 21, self.input)
@@ -6269,7 +6269,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 22, self.input)
@@ -6280,7 +6280,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 1, self.input)
@@ -6298,7 +6298,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 44, self.input)
@@ -6314,7 +6314,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 45, self.input)
@@ -6330,7 +6330,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 46, self.input)
@@ -6346,7 +6346,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 47, self.input)
@@ -6362,7 +6362,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 48, self.input)
@@ -6378,7 +6378,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 49, self.input)
@@ -6394,7 +6394,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 50, self.input)
@@ -6407,7 +6407,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 2, self.input)
@@ -6425,7 +6425,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 73, self.input)
@@ -6441,7 +6441,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 74, self.input)
@@ -6457,7 +6457,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 75, self.input)
@@ -6473,7 +6473,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 76, self.input)
@@ -6489,7 +6489,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 77, self.input)
@@ -6505,7 +6505,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 78, self.input)
@@ -6521,7 +6521,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 79, self.input)
@@ -6534,7 +6534,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 3, self.input)
@@ -6552,7 +6552,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 102, self.input)
@@ -6568,7 +6568,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 103, self.input)
@@ -6584,7 +6584,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 104, self.input)
@@ -6600,7 +6600,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 105, self.input)
@@ -6616,7 +6616,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 106, self.input)
@@ -6632,7 +6632,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 107, self.input)
@@ -6648,7 +6648,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 108, self.input)
@@ -6661,7 +6661,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 4, self.input)
@@ -6679,7 +6679,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 131, self.input)
@@ -6695,7 +6695,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 132, self.input)
@@ -6711,7 +6711,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 133, self.input)
@@ -6727,7 +6727,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 134, self.input)
@@ -6743,7 +6743,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 135, self.input)
@@ -6759,7 +6759,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 136, self.input)
@@ -6775,7 +6775,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 137, self.input)
@@ -6788,7 +6788,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 5, self.input)
@@ -6806,7 +6806,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 160, self.input)
@@ -6822,7 +6822,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 161, self.input)
@@ -6838,7 +6838,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 162, self.input)
@@ -6854,7 +6854,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 163, self.input)
@@ -6870,7 +6870,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 164, self.input)
@@ -6886,7 +6886,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 165, self.input)
@@ -6902,7 +6902,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 166, self.input)
@@ -6918,7 +6918,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 167, self.input)
@@ -6936,7 +6936,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 189, self.input)
@@ -6947,7 +6947,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 6, self.input)
@@ -6965,7 +6965,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 191, self.input)
@@ -6981,7 +6981,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 192, self.input)
@@ -6997,7 +6997,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 193, self.input)
@@ -7013,7 +7013,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 194, self.input)
@@ -7029,7 +7029,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 195, self.input)
@@ -7045,7 +7045,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 196, self.input)
@@ -7061,7 +7061,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 197, self.input)
@@ -7074,7 +7074,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 7, self.input)
@@ -7092,7 +7092,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 220, self.input)
@@ -7108,7 +7108,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 221, self.input)
@@ -7124,7 +7124,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 222, self.input)
@@ -7140,7 +7140,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 223, self.input)
@@ -7156,7 +7156,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 224, self.input)
@@ -7172,7 +7172,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 225, self.input)
@@ -7188,7 +7188,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 226, self.input)
@@ -7204,7 +7204,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 227, self.input)
@@ -7220,7 +7220,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 228, self.input)
@@ -7236,7 +7236,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 229, self.input)
@@ -7252,7 +7252,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 230, self.input)
@@ -7268,7 +7268,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 231, self.input)
@@ -7279,7 +7279,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 8, self.input)
@@ -7297,7 +7297,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 244, self.input)
@@ -7313,7 +7313,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 245, self.input)
@@ -7329,7 +7329,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 246, self.input)
@@ -7345,7 +7345,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 247, self.input)
@@ -7361,7 +7361,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 248, self.input)
@@ -7377,7 +7377,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 249, self.input)
@@ -7393,7 +7393,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 250, self.input)
@@ -7409,7 +7409,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 251, self.input)
@@ -7425,7 +7425,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 252, self.input)
@@ -7441,7 +7441,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 253, self.input)
@@ -7457,7 +7457,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 254, self.input)
@@ -7473,7 +7473,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 255, self.input)
@@ -7482,7 +7482,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 9, self.input)
@@ -7500,7 +7500,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 256, self.input)
@@ -7516,7 +7516,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 257, self.input)
@@ -7532,7 +7532,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 258, self.input)
@@ -7548,7 +7548,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 259, self.input)
@@ -7564,7 +7564,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 260, self.input)
@@ -7580,7 +7580,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 261, self.input)
@@ -7596,7 +7596,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 262, self.input)
@@ -7612,7 +7612,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 263, self.input)
@@ -7628,7 +7628,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 264, self.input)
@@ -7644,7 +7644,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 265, self.input)
@@ -7660,7 +7660,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 266, self.input)
@@ -7676,7 +7676,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 267, self.input)
@@ -7685,7 +7685,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 10, self.input)
@@ -7703,7 +7703,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 268, self.input)
@@ -7719,7 +7719,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 269, self.input)
@@ -7735,7 +7735,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 270, self.input)
@@ -7751,7 +7751,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 271, self.input)
@@ -7767,7 +7767,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 272, self.input)
@@ -7783,7 +7783,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 273, self.input)
@@ -7799,7 +7799,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 274, self.input)
@@ -7815,7 +7815,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 275, self.input)
@@ -7831,7 +7831,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 276, self.input)
@@ -7847,7 +7847,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 277, self.input)
@@ -7863,7 +7863,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 278, self.input)
@@ -7879,7 +7879,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 279, self.input)
@@ -7888,7 +7888,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 11, self.input)
@@ -7906,7 +7906,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 280, self.input)
@@ -7922,7 +7922,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 281, self.input)
@@ -7938,7 +7938,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 282, self.input)
@@ -7954,7 +7954,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 283, self.input)
@@ -7970,7 +7970,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 284, self.input)
@@ -7986,7 +7986,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 285, self.input)
@@ -8002,7 +8002,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 286, self.input)
@@ -8018,7 +8018,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 287, self.input)
@@ -8034,7 +8034,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 288, self.input)
@@ -8050,7 +8050,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 289, self.input)
@@ -8066,7 +8066,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 290, self.input)
@@ -8082,7 +8082,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 291, self.input)
@@ -8091,7 +8091,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 12, self.input)
@@ -8100,7 +8100,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 0, self.input)
@@ -8112,17 +8112,17 @@ class CParser(Parser):
self.lvalue()
self.following.pop()
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_assignment_operator_in_assignment_expression1746)
self.assignment_operator()
self.following.pop()
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_assignment_expression_in_assignment_expression1748)
self.assignment_expression()
self.following.pop()
if self.failed:
- return
+ return
elif alt74 == 2:
@@ -8131,7 +8131,7 @@ class CParser(Parser):
self.conditional_expression()
self.following.pop()
if self.failed:
- return
+ return
@@ -8144,7 +8144,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end assignment_expression
@@ -8157,7 +8157,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 49):
- return
+ return
# C.g:463:2: ( unary_expression )
# C.g:463:4: unary_expression
@@ -8165,7 +8165,7 @@ class CParser(Parser):
self.unary_expression()
self.following.pop()
if self.failed:
- return
+ return
@@ -8179,7 +8179,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end lvalue
@@ -8192,7 +8192,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 50):
- return
+ return
# C.g:467:2: ( '=' | '*=' | '/=' | '%=' | '+=' | '-=' | '<<=' | '>>=' | '&=' | '^=' | '|=' )
# C.g:
@@ -8204,7 +8204,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -8226,7 +8226,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end assignment_operator
@@ -8242,7 +8242,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 51):
- return
+ return
# C.g:481:2: (e= logical_or_expression ( '?' expression ':' conditional_expression )? )
# C.g:481:4: e= logical_or_expression ( '?' expression ':' conditional_expression )?
@@ -8250,7 +8250,7 @@ class CParser(Parser):
e = self.logical_or_expression()
self.following.pop()
if self.failed:
- return
+ return
# C.g:481:28: ( '?' expression ':' conditional_expression )?
alt75 = 2
LA75_0 = self.input.LA(1)
@@ -8261,20 +8261,20 @@ class CParser(Parser):
# C.g:481:29: '?' expression ':' conditional_expression
self.match(self.input, 90, self.FOLLOW_90_in_conditional_expression1842)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_expression_in_conditional_expression1844)
self.expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 47, self.FOLLOW_47_in_conditional_expression1846)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_conditional_expression_in_conditional_expression1848)
self.conditional_expression()
self.following.pop()
if self.failed:
- return
+ return
if self.backtracking == 0:
self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
@@ -8294,7 +8294,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end conditional_expression
@@ -8377,7 +8377,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 53):
- return
+ return
# C.g:489:2: ( inclusive_or_expression ( '&&' inclusive_or_expression )* )
# C.g:489:4: inclusive_or_expression ( '&&' inclusive_or_expression )*
@@ -8385,7 +8385,7 @@ class CParser(Parser):
self.inclusive_or_expression()
self.following.pop()
if self.failed:
- return
+ return
# C.g:489:28: ( '&&' inclusive_or_expression )*
while True: #loop77
alt77 = 2
@@ -8399,12 +8399,12 @@ class CParser(Parser):
# C.g:489:29: '&&' inclusive_or_expression
self.match(self.input, 92, self.FOLLOW_92_in_logical_and_expression1884)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_inclusive_or_expression_in_logical_and_expression1886)
self.inclusive_or_expression()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -8424,7 +8424,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end logical_and_expression
@@ -8437,7 +8437,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 54):
- return
+ return
# C.g:493:2: ( exclusive_or_expression ( '|' exclusive_or_expression )* )
# C.g:493:4: exclusive_or_expression ( '|' exclusive_or_expression )*
@@ -8445,7 +8445,7 @@ class CParser(Parser):
self.exclusive_or_expression()
self.following.pop()
if self.failed:
- return
+ return
# C.g:493:28: ( '|' exclusive_or_expression )*
while True: #loop78
alt78 = 2
@@ -8459,12 +8459,12 @@ class CParser(Parser):
# C.g:493:29: '|' exclusive_or_expression
self.match(self.input, 93, self.FOLLOW_93_in_inclusive_or_expression1902)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_exclusive_or_expression_in_inclusive_or_expression1904)
self.exclusive_or_expression()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -8484,7 +8484,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end inclusive_or_expression
@@ -8497,7 +8497,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 55):
- return
+ return
# C.g:497:2: ( and_expression ( '^' and_expression )* )
# C.g:497:4: and_expression ( '^' and_expression )*
@@ -8505,7 +8505,7 @@ class CParser(Parser):
self.and_expression()
self.following.pop()
if self.failed:
- return
+ return
# C.g:497:19: ( '^' and_expression )*
while True: #loop79
alt79 = 2
@@ -8519,12 +8519,12 @@ class CParser(Parser):
# C.g:497:20: '^' and_expression
self.match(self.input, 94, self.FOLLOW_94_in_exclusive_or_expression1920)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_and_expression_in_exclusive_or_expression1922)
self.and_expression()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -8544,7 +8544,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end exclusive_or_expression
@@ -8557,7 +8557,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 56):
- return
+ return
# C.g:501:2: ( equality_expression ( '&' equality_expression )* )
# C.g:501:4: equality_expression ( '&' equality_expression )*
@@ -8565,7 +8565,7 @@ class CParser(Parser):
self.equality_expression()
self.following.pop()
if self.failed:
- return
+ return
# C.g:501:24: ( '&' equality_expression )*
while True: #loop80
alt80 = 2
@@ -8579,12 +8579,12 @@ class CParser(Parser):
# C.g:501:25: '&' equality_expression
self.match(self.input, 77, self.FOLLOW_77_in_and_expression1938)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_equality_expression_in_and_expression1940)
self.equality_expression()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -8604,7 +8604,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end and_expression
@@ -8617,7 +8617,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 57):
- return
+ return
# C.g:504:2: ( relational_expression ( ( '==' | '!=' ) relational_expression )* )
# C.g:504:4: relational_expression ( ( '==' | '!=' ) relational_expression )*
@@ -8625,7 +8625,7 @@ class CParser(Parser):
self.relational_expression()
self.following.pop()
if self.failed:
- return
+ return
# C.g:504:26: ( ( '==' | '!=' ) relational_expression )*
while True: #loop81
alt81 = 2
@@ -8645,7 +8645,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -8658,7 +8658,7 @@ class CParser(Parser):
self.relational_expression()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -8678,7 +8678,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end equality_expression
@@ -8691,7 +8691,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 58):
- return
+ return
# C.g:508:2: ( shift_expression ( ( '<' | '>' | '<=' | '>=' ) shift_expression )* )
# C.g:508:4: shift_expression ( ( '<' | '>' | '<=' | '>=' ) shift_expression )*
@@ -8699,7 +8699,7 @@ class CParser(Parser):
self.shift_expression()
self.following.pop()
if self.failed:
- return
+ return
# C.g:508:21: ( ( '<' | '>' | '<=' | '>=' ) shift_expression )*
while True: #loop82
alt82 = 2
@@ -8719,7 +8719,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -8732,7 +8732,7 @@ class CParser(Parser):
self.shift_expression()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -8752,7 +8752,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end relational_expression
@@ -8765,7 +8765,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 59):
- return
+ return
# C.g:512:2: ( additive_expression ( ( '<<' | '>>' ) additive_expression )* )
# C.g:512:4: additive_expression ( ( '<<' | '>>' ) additive_expression )*
@@ -8773,7 +8773,7 @@ class CParser(Parser):
self.additive_expression()
self.following.pop()
if self.failed:
- return
+ return
# C.g:512:24: ( ( '<<' | '>>' ) additive_expression )*
while True: #loop83
alt83 = 2
@@ -8793,7 +8793,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -8806,7 +8806,7 @@ class CParser(Parser):
self.additive_expression()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -8826,7 +8826,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end shift_expression
@@ -8839,7 +8839,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 60):
- return
+ return
# C.g:518:2: ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration )
alt84 = 11
@@ -8860,7 +8860,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 43, self.input)
@@ -8880,7 +8880,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 47, self.input)
@@ -8896,7 +8896,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 53, self.input)
@@ -8912,7 +8912,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 68, self.input)
@@ -8923,7 +8923,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 1, self.input)
@@ -8952,7 +8952,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 0, self.input)
@@ -8964,7 +8964,7 @@ class CParser(Parser):
self.labeled_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 2:
@@ -8973,7 +8973,7 @@ class CParser(Parser):
self.compound_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 3:
@@ -8982,7 +8982,7 @@ class CParser(Parser):
self.expression_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 4:
@@ -8991,7 +8991,7 @@ class CParser(Parser):
self.selection_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 5:
@@ -9000,7 +9000,7 @@ class CParser(Parser):
self.iteration_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 6:
@@ -9009,7 +9009,7 @@ class CParser(Parser):
self.jump_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 7:
@@ -9018,7 +9018,7 @@ class CParser(Parser):
self.macro_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 8:
@@ -9027,7 +9027,7 @@ class CParser(Parser):
self.asm2_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 9:
@@ -9036,7 +9036,7 @@ class CParser(Parser):
self.asm1_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 10:
@@ -9045,7 +9045,7 @@ class CParser(Parser):
self.asm_statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt84 == 11:
@@ -9054,7 +9054,7 @@ class CParser(Parser):
self.declaration()
self.following.pop()
if self.failed:
- return
+ return
@@ -9067,7 +9067,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end statement
@@ -9080,7 +9080,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 61):
- return
+ return
# C.g:532:2: ( ( '__asm__' )? IDENTIFIER '(' (~ ( ';' ) )* ')' ';' )
# C.g:532:4: ( '__asm__' )? IDENTIFIER '(' (~ ( ';' ) )* ')' ';'
@@ -9094,16 +9094,16 @@ class CParser(Parser):
# C.g:0:0: '__asm__'
self.match(self.input, 103, self.FOLLOW_103_in_asm2_statement2086)
if self.failed:
- return
+ return
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_asm2_statement2089)
if self.failed:
- return
+ return
self.match(self.input, 62, self.FOLLOW_62_in_asm2_statement2091)
if self.failed:
- return
+ return
# C.g:532:30: (~ ( ';' ) )*
while True: #loop86
alt86 = 2
@@ -9130,7 +9130,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -9147,10 +9147,10 @@ class CParser(Parser):
self.match(self.input, 63, self.FOLLOW_63_in_asm2_statement2101)
if self.failed:
- return
+ return
self.match(self.input, 25, self.FOLLOW_25_in_asm2_statement2103)
if self.failed:
- return
+ return
@@ -9164,7 +9164,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end asm2_statement
@@ -9177,16 +9177,16 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 62):
- return
+ return
# C.g:536:2: ( '_asm' '{' (~ ( '}' ) )* '}' )
# C.g:536:4: '_asm' '{' (~ ( '}' ) )* '}'
self.match(self.input, 104, self.FOLLOW_104_in_asm1_statement2115)
if self.failed:
- return
+ return
self.match(self.input, 43, self.FOLLOW_43_in_asm1_statement2117)
if self.failed:
- return
+ return
# C.g:536:15: (~ ( '}' ) )*
while True: #loop87
alt87 = 2
@@ -9206,7 +9206,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -9223,7 +9223,7 @@ class CParser(Parser):
self.match(self.input, 44, self.FOLLOW_44_in_asm1_statement2127)
if self.failed:
- return
+ return
@@ -9237,7 +9237,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end asm1_statement
@@ -9250,16 +9250,16 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 63):
- return
+ return
# C.g:540:2: ( '__asm' '{' (~ ( '}' ) )* '}' )
# C.g:540:4: '__asm' '{' (~ ( '}' ) )* '}'
self.match(self.input, 105, self.FOLLOW_105_in_asm_statement2138)
if self.failed:
- return
+ return
self.match(self.input, 43, self.FOLLOW_43_in_asm_statement2140)
if self.failed:
- return
+ return
# C.g:540:16: (~ ( '}' ) )*
while True: #loop88
alt88 = 2
@@ -9279,7 +9279,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
mse = MismatchedSetException(None, self.input)
self.recoverFromMismatchedSet(
@@ -9296,7 +9296,7 @@ class CParser(Parser):
self.match(self.input, 44, self.FOLLOW_44_in_asm_statement2150)
if self.failed:
- return
+ return
@@ -9310,7 +9310,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end asm_statement
@@ -9323,16 +9323,16 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 64):
- return
+ return
# C.g:544:2: ( IDENTIFIER '(' ( declaration )* ( statement_list )? ( expression )? ')' )
# C.g:544:4: IDENTIFIER '(' ( declaration )* ( statement_list )? ( expression )? ')'
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_macro_statement2162)
if self.failed:
- return
+ return
self.match(self.input, 62, self.FOLLOW_62_in_macro_statement2164)
if self.failed:
- return
+ return
# C.g:544:19: ( declaration )*
while True: #loop89
alt89 = 2
@@ -11234,7 +11234,7 @@ class CParser(Parser):
self.declaration()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -12440,7 +12440,7 @@ class CParser(Parser):
self.statement_list()
self.following.pop()
if self.failed:
- return
+ return
@@ -12456,13 +12456,13 @@ class CParser(Parser):
self.expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_macro_statement2176)
if self.failed:
- return
+ return
@@ -12476,7 +12476,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end macro_statement
@@ -12489,7 +12489,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 65):
- return
+ return
# C.g:548:2: ( IDENTIFIER ':' statement | 'case' constant_expression ':' statement | 'default' ':' statement )
alt92 = 3
@@ -12503,7 +12503,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("547:1: labeled_statement : ( IDENTIFIER ':' statement | 'case' constant_expression ':' statement | 'default' ':' statement );", 92, 0, self.input)
@@ -12513,50 +12513,50 @@ class CParser(Parser):
# C.g:548:4: IDENTIFIER ':' statement
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_labeled_statement2188)
if self.failed:
- return
+ return
self.match(self.input, 47, self.FOLLOW_47_in_labeled_statement2190)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_statement_in_labeled_statement2192)
self.statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt92 == 2:
# C.g:549:4: 'case' constant_expression ':' statement
self.match(self.input, 106, self.FOLLOW_106_in_labeled_statement2197)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_constant_expression_in_labeled_statement2199)
self.constant_expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 47, self.FOLLOW_47_in_labeled_statement2201)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_statement_in_labeled_statement2203)
self.statement()
self.following.pop()
if self.failed:
- return
+ return
elif alt92 == 3:
# C.g:550:4: 'default' ':' statement
self.match(self.input, 107, self.FOLLOW_107_in_labeled_statement2208)
if self.failed:
- return
+ return
self.match(self.input, 47, self.FOLLOW_47_in_labeled_statement2210)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_statement_in_labeled_statement2212)
self.statement()
self.following.pop()
if self.failed:
- return
+ return
@@ -12569,7 +12569,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end labeled_statement
@@ -14552,7 +14552,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 67):
- return
+ return
# C.g:558:2: ( ( statement )+ )
# C.g:558:4: ( statement )+
@@ -16230,7 +16230,7 @@ class CParser(Parser):
self.statement()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -16239,7 +16239,7 @@ class CParser(Parser):
if self.backtracking > 0:
self.failed = True
- return
+ return
eee = EarlyExitException(95, self.input)
raise eee
@@ -16260,7 +16260,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end statement_list
@@ -16347,7 +16347,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 69):
- return
+ return
# C.g:567:2: ( 'if' '(' e= expression ')' statement ( options {k=1; backtrack=false; } : 'else' statement )? | 'switch' '(' expression ')' statement )
alt98 = 2
@@ -16360,7 +16360,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("566:1: selection_statement : ( 'if' '(' e= expression ')' statement ( options {k=1; backtrack=false; } : 'else' statement )? | 'switch' '(' expression ')' statement );", 98, 0, self.input)
@@ -16370,18 +16370,18 @@ class CParser(Parser):
# C.g:567:4: 'if' '(' e= expression ')' statement ( options {k=1; backtrack=false; } : 'else' statement )?
self.match(self.input, 108, self.FOLLOW_108_in_selection_statement2272)
if self.failed:
- return
+ return
self.match(self.input, 62, self.FOLLOW_62_in_selection_statement2274)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_expression_in_selection_statement2278)
e = self.expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_selection_statement2280)
if self.failed:
- return
+ return
if self.backtracking == 0:
self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
@@ -16389,7 +16389,7 @@ class CParser(Parser):
self.statement()
self.following.pop()
if self.failed:
- return
+ return
# C.g:567:167: ( options {k=1; backtrack=false; } : 'else' statement )?
alt97 = 2
LA97_0 = self.input.LA(1)
@@ -16400,12 +16400,12 @@ class CParser(Parser):
# C.g:567:200: 'else' statement
self.match(self.input, 109, self.FOLLOW_109_in_selection_statement2299)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_statement_in_selection_statement2301)
self.statement()
self.following.pop()
if self.failed:
- return
+ return
@@ -16415,23 +16415,23 @@ class CParser(Parser):
# C.g:568:4: 'switch' '(' expression ')' statement
self.match(self.input, 110, self.FOLLOW_110_in_selection_statement2308)
if self.failed:
- return
+ return
self.match(self.input, 62, self.FOLLOW_62_in_selection_statement2310)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_expression_in_selection_statement2312)
self.expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_selection_statement2314)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_statement_in_selection_statement2316)
self.statement()
self.following.pop()
if self.failed:
- return
+ return
@@ -16444,7 +16444,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end selection_statement
@@ -16460,7 +16460,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 70):
- return
+ return
# C.g:572:2: ( 'while' '(' e= expression ')' statement | 'do' statement 'while' '(' e= expression ')' ';' | 'for' '(' expression_statement e= expression_statement ( expression )? ')' statement )
alt100 = 3
@@ -16474,7 +16474,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("571:1: iteration_statement : ( 'while' '(' e= expression ')' statement | 'do' statement 'while' '(' e= expression ')' ';' | 'for' '(' expression_statement e= expression_statement ( expression )? ')' statement );", 100, 0, self.input)
@@ -16484,23 +16484,23 @@ class CParser(Parser):
# C.g:572:4: 'while' '(' e= expression ')' statement
self.match(self.input, 111, self.FOLLOW_111_in_iteration_statement2327)
if self.failed:
- return
+ return
self.match(self.input, 62, self.FOLLOW_62_in_iteration_statement2329)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_expression_in_iteration_statement2333)
e = self.expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_iteration_statement2335)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_statement_in_iteration_statement2337)
self.statement()
self.following.pop()
if self.failed:
- return
+ return
if self.backtracking == 0:
self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
@@ -16510,29 +16510,29 @@ class CParser(Parser):
# C.g:573:4: 'do' statement 'while' '(' e= expression ')' ';'
self.match(self.input, 112, self.FOLLOW_112_in_iteration_statement2344)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_statement_in_iteration_statement2346)
self.statement()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 111, self.FOLLOW_111_in_iteration_statement2348)
if self.failed:
- return
+ return
self.match(self.input, 62, self.FOLLOW_62_in_iteration_statement2350)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_expression_in_iteration_statement2354)
e = self.expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_iteration_statement2356)
if self.failed:
- return
+ return
self.match(self.input, 25, self.FOLLOW_25_in_iteration_statement2358)
if self.failed:
- return
+ return
if self.backtracking == 0:
self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
@@ -16542,20 +16542,20 @@ class CParser(Parser):
# C.g:574:4: 'for' '(' expression_statement e= expression_statement ( expression )? ')' statement
self.match(self.input, 113, self.FOLLOW_113_in_iteration_statement2365)
if self.failed:
- return
+ return
self.match(self.input, 62, self.FOLLOW_62_in_iteration_statement2367)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_expression_statement_in_iteration_statement2369)
self.expression_statement()
self.following.pop()
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_expression_statement_in_iteration_statement2373)
e = self.expression_statement()
self.following.pop()
if self.failed:
- return
+ return
# C.g:574:58: ( expression )?
alt99 = 2
LA99_0 = self.input.LA(1)
@@ -16568,18 +16568,18 @@ class CParser(Parser):
self.expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_iteration_statement2378)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_statement_in_iteration_statement2380)
self.statement()
self.following.pop()
if self.failed:
- return
+ return
if self.backtracking == 0:
self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
@@ -16595,7 +16595,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end iteration_statement
@@ -16608,7 +16608,7 @@ class CParser(Parser):
try:
try:
if self.backtracking > 0 and self.alreadyParsedRule(self.input, 71):
- return
+ return
# C.g:578:2: ( 'goto' IDENTIFIER ';' | 'continue' ';' | 'break' ';' | 'return' ';' | 'return' expression ';' )
alt101 = 5
@@ -16629,7 +16629,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("577:1: jump_statement : ( 'goto' IDENTIFIER ';' | 'continue' ';' | 'break' ';' | 'return' ';' | 'return' expression ';' );", 101, 4, self.input)
@@ -16638,7 +16638,7 @@ class CParser(Parser):
else:
if self.backtracking > 0:
self.failed = True
- return
+ return
nvae = NoViableAltException("577:1: jump_statement : ( 'goto' IDENTIFIER ';' | 'continue' ';' | 'break' ';' | 'return' ';' | 'return' expression ';' );", 101, 0, self.input)
@@ -16648,58 +16648,58 @@ class CParser(Parser):
# C.g:578:4: 'goto' IDENTIFIER ';'
self.match(self.input, 114, self.FOLLOW_114_in_jump_statement2393)
if self.failed:
- return
+ return
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_jump_statement2395)
if self.failed:
- return
+ return
self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2397)
if self.failed:
- return
+ return
elif alt101 == 2:
# C.g:579:4: 'continue' ';'
self.match(self.input, 115, self.FOLLOW_115_in_jump_statement2402)
if self.failed:
- return
+ return
self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2404)
if self.failed:
- return
+ return
elif alt101 == 3:
# C.g:580:4: 'break' ';'
self.match(self.input, 116, self.FOLLOW_116_in_jump_statement2409)
if self.failed:
- return
+ return
self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2411)
if self.failed:
- return
+ return
elif alt101 == 4:
# C.g:581:4: 'return' ';'
self.match(self.input, 117, self.FOLLOW_117_in_jump_statement2416)
if self.failed:
- return
+ return
self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2418)
if self.failed:
- return
+ return
elif alt101 == 5:
# C.g:582:4: 'return' expression ';'
self.match(self.input, 117, self.FOLLOW_117_in_jump_statement2423)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_expression_in_jump_statement2425)
self.expression()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2427)
if self.failed:
- return
+ return
@@ -16712,7 +16712,7 @@ class CParser(Parser):
pass
- return
+ return
# $ANTLR end jump_statement
@@ -16724,7 +16724,7 @@ class CParser(Parser):
self.declaration_specifiers()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred2
@@ -16855,7 +16855,7 @@ class CParser(Parser):
self.declaration_specifiers()
self.following.pop()
if self.failed:
- return
+ return
@@ -16863,7 +16863,7 @@ class CParser(Parser):
self.declarator()
self.following.pop()
if self.failed:
- return
+ return
# C.g:119:41: ( declaration )*
while True: #loop103
alt103 = 2
@@ -16879,7 +16879,7 @@ class CParser(Parser):
self.declaration()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -16888,7 +16888,7 @@ class CParser(Parser):
self.match(self.input, 43, self.FOLLOW_43_in_synpred4108)
if self.failed:
- return
+ return
# $ANTLR end synpred4
@@ -16903,7 +16903,7 @@ class CParser(Parser):
self.declaration()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred5
@@ -16918,7 +16918,7 @@ class CParser(Parser):
self.declaration_specifiers()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred7
@@ -16933,7 +16933,7 @@ class CParser(Parser):
self.declaration_specifiers()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred10
@@ -16948,7 +16948,7 @@ class CParser(Parser):
self.type_specifier()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred14
@@ -16963,7 +16963,7 @@ class CParser(Parser):
self.type_qualifier()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred15
@@ -16978,7 +16978,7 @@ class CParser(Parser):
self.type_qualifier()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred33
@@ -16991,7 +16991,7 @@ class CParser(Parser):
# C.g:225:5: IDENTIFIER ( type_qualifier )* declarator
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_synpred34442)
if self.failed:
- return
+ return
# C.g:225:16: ( type_qualifier )*
while True: #loop106
alt106 = 2
@@ -17026,7 +17026,7 @@ class CParser(Parser):
self.type_qualifier()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -17037,7 +17037,7 @@ class CParser(Parser):
self.declarator()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred34
@@ -17052,7 +17052,7 @@ class CParser(Parser):
self.type_qualifier()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred39
@@ -17067,7 +17067,7 @@ class CParser(Parser):
self.type_specifier()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred40
@@ -17090,7 +17090,7 @@ class CParser(Parser):
self.pointer()
self.following.pop()
if self.failed:
- return
+ return
@@ -17104,7 +17104,7 @@ class CParser(Parser):
# C.g:297:14: 'EFIAPI'
self.match(self.input, 58, self.FOLLOW_58_in_synpred66788)
if self.failed:
- return
+ return
@@ -17118,7 +17118,7 @@ class CParser(Parser):
# C.g:297:26: 'EFI_BOOTSERVICE'
self.match(self.input, 59, self.FOLLOW_59_in_synpred66793)
if self.failed:
- return
+ return
@@ -17132,7 +17132,7 @@ class CParser(Parser):
# C.g:297:47: 'EFI_RUNTIMESERVICE'
self.match(self.input, 60, self.FOLLOW_60_in_synpred66798)
if self.failed:
- return
+ return
@@ -17140,7 +17140,7 @@ class CParser(Parser):
self.direct_declarator()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred66
@@ -17155,7 +17155,7 @@ class CParser(Parser):
self.declarator_suffix()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred67
@@ -17168,7 +17168,7 @@ class CParser(Parser):
# C.g:304:9: 'EFIAPI'
self.match(self.input, 58, self.FOLLOW_58_in_synpred69830)
if self.failed:
- return
+ return
# $ANTLR end synpred69
@@ -17183,7 +17183,7 @@ class CParser(Parser):
self.declarator_suffix()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred70
@@ -17196,15 +17196,15 @@ class CParser(Parser):
# C.g:310:9: '(' parameter_type_list ')'
self.match(self.input, 62, self.FOLLOW_62_in_synpred73878)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_parameter_type_list_in_synpred73880)
self.parameter_type_list()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_synpred73882)
if self.failed:
- return
+ return
# $ANTLR end synpred73
@@ -17217,15 +17217,15 @@ class CParser(Parser):
# C.g:311:9: '(' identifier_list ')'
self.match(self.input, 62, self.FOLLOW_62_in_synpred74892)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_identifier_list_in_synpred74894)
self.identifier_list()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_synpred74896)
if self.failed:
- return
+ return
# $ANTLR end synpred74
@@ -17240,7 +17240,7 @@ class CParser(Parser):
self.type_qualifier()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred75
@@ -17255,7 +17255,7 @@ class CParser(Parser):
self.pointer()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred76
@@ -17268,7 +17268,7 @@ class CParser(Parser):
# C.g:316:4: '*' ( type_qualifier )+ ( pointer )?
self.match(self.input, 66, self.FOLLOW_66_in_synpred77919)
if self.failed:
- return
+ return
# C.g:316:8: ( type_qualifier )+
cnt116 = 0
while True: #loop116
@@ -17285,7 +17285,7 @@ class CParser(Parser):
self.type_qualifier()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -17294,7 +17294,7 @@ class CParser(Parser):
if self.backtracking > 0:
self.failed = True
- return
+ return
eee = EarlyExitException(116, self.input)
raise eee
@@ -17314,7 +17314,7 @@ class CParser(Parser):
self.pointer()
self.following.pop()
if self.failed:
- return
+ return
@@ -17330,12 +17330,12 @@ class CParser(Parser):
# C.g:317:4: '*' pointer
self.match(self.input, 66, self.FOLLOW_66_in_synpred78930)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_pointer_in_synpred78932)
self.pointer()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred78
@@ -17348,7 +17348,7 @@ class CParser(Parser):
# C.g:326:32: 'OPTIONAL'
self.match(self.input, 53, self.FOLLOW_53_in_synpred81977)
if self.failed:
- return
+ return
# $ANTLR end synpred81
@@ -17361,7 +17361,7 @@ class CParser(Parser):
# C.g:326:27: ',' ( 'OPTIONAL' )? parameter_declaration
self.match(self.input, 27, self.FOLLOW_27_in_synpred82974)
if self.failed:
- return
+ return
# C.g:326:31: ( 'OPTIONAL' )?
alt119 = 2
LA119_0 = self.input.LA(1)
@@ -17375,7 +17375,7 @@ class CParser(Parser):
# C.g:326:32: 'OPTIONAL'
self.match(self.input, 53, self.FOLLOW_53_in_synpred82977)
if self.failed:
- return
+ return
@@ -17383,7 +17383,7 @@ class CParser(Parser):
self.parameter_declaration()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred82
@@ -17398,7 +17398,7 @@ class CParser(Parser):
self.declarator()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred83
@@ -17413,7 +17413,7 @@ class CParser(Parser):
self.abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred84
@@ -17428,7 +17428,7 @@ class CParser(Parser):
self.declaration_specifiers()
self.following.pop()
if self.failed:
- return
+ return
# C.g:330:27: ( declarator | abstract_declarator )*
while True: #loop120
alt120 = 3
@@ -17512,7 +17512,7 @@ class CParser(Parser):
self.declarator()
self.following.pop()
if self.failed:
- return
+ return
elif alt120 == 2:
@@ -17521,7 +17521,7 @@ class CParser(Parser):
self.abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
else:
@@ -17538,7 +17538,7 @@ class CParser(Parser):
# C.g:330:62: 'OPTIONAL'
self.match(self.input, 53, self.FOLLOW_53_in_synpred861004)
if self.failed:
- return
+ return
@@ -17556,7 +17556,7 @@ class CParser(Parser):
self.specifier_qualifier_list()
self.following.pop()
if self.failed:
- return
+ return
# C.g:341:29: ( abstract_declarator )?
alt122 = 2
LA122_0 = self.input.LA(1)
@@ -17569,7 +17569,7 @@ class CParser(Parser):
self.abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
@@ -17587,7 +17587,7 @@ class CParser(Parser):
self.direct_abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred91
@@ -17600,15 +17600,15 @@ class CParser(Parser):
# C.g:351:6: '(' abstract_declarator ')'
self.match(self.input, 62, self.FOLLOW_62_in_synpred931086)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_abstract_declarator_in_synpred931088)
self.abstract_declarator()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_synpred931090)
if self.failed:
- return
+ return
# $ANTLR end synpred93
@@ -17623,7 +17623,7 @@ class CParser(Parser):
self.abstract_declarator_suffix()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred94
@@ -17636,20 +17636,20 @@ class CParser(Parser):
# C.g:386:4: '(' type_name ')' cast_expression
self.match(self.input, 62, self.FOLLOW_62_in_synpred1091282)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_type_name_in_synpred1091284)
self.type_name()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_synpred1091286)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_cast_expression_in_synpred1091288)
self.cast_expression()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred109
@@ -17662,12 +17662,12 @@ class CParser(Parser):
# C.g:395:4: 'sizeof' unary_expression
self.match(self.input, 74, self.FOLLOW_74_in_synpred1141330)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_unary_expression_in_synpred1141332)
self.unary_expression()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred114
@@ -17680,15 +17680,15 @@ class CParser(Parser):
# C.g:409:13: '(' argument_expression_list ')'
self.match(self.input, 62, self.FOLLOW_62_in_synpred1171420)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_argument_expression_list_in_synpred1171424)
self.argument_expression_list()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_synpred1171428)
if self.failed:
- return
+ return
# $ANTLR end synpred117
@@ -17701,15 +17701,15 @@ class CParser(Parser):
# C.g:410:13: '(' macro_parameter_list ')'
self.match(self.input, 62, self.FOLLOW_62_in_synpred1181444)
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_macro_parameter_list_in_synpred1181446)
self.macro_parameter_list()
self.following.pop()
if self.failed:
- return
+ return
self.match(self.input, 63, self.FOLLOW_63_in_synpred1181448)
if self.failed:
- return
+ return
# $ANTLR end synpred118
@@ -17722,10 +17722,10 @@ class CParser(Parser):
# C.g:412:13: '*' IDENTIFIER
self.match(self.input, 66, self.FOLLOW_66_in_synpred1201482)
if self.failed:
- return
+ return
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_synpred1201486)
if self.failed:
- return
+ return
# $ANTLR end synpred120
@@ -17738,7 +17738,7 @@ class CParser(Parser):
# C.g:443:20: STRING_LITERAL
self.match(self.input, STRING_LITERAL, self.FOLLOW_STRING_LITERAL_in_synpred1371683)
if self.failed:
- return
+ return
# $ANTLR end synpred137
@@ -17762,7 +17762,7 @@ class CParser(Parser):
# C.g:0:0: IDENTIFIER
self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_synpred1381680)
if self.failed:
- return
+ return
else:
@@ -17783,7 +17783,7 @@ class CParser(Parser):
# C.g:0:0: STRING_LITERAL
self.match(self.input, STRING_LITERAL, self.FOLLOW_STRING_LITERAL_in_synpred1381683)
if self.failed:
- return
+ return
else:
@@ -17792,7 +17792,7 @@ class CParser(Parser):
if self.backtracking > 0:
self.failed = True
- return
+ return
eee = EarlyExitException(126, self.input)
raise eee
@@ -17814,17 +17814,17 @@ class CParser(Parser):
self.lvalue()
self.following.pop()
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_assignment_operator_in_synpred1421746)
self.assignment_operator()
self.following.pop()
if self.failed:
- return
+ return
self.following.append(self.FOLLOW_assignment_expression_in_synpred1421748)
self.assignment_expression()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred142
@@ -17839,7 +17839,7 @@ class CParser(Parser):
self.expression_statement()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred169
@@ -17854,7 +17854,7 @@ class CParser(Parser):
self.macro_statement()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred173
@@ -17869,7 +17869,7 @@ class CParser(Parser):
self.asm2_statement()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred174
@@ -17884,7 +17884,7 @@ class CParser(Parser):
self.declaration()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred181
@@ -17899,7 +17899,7 @@ class CParser(Parser):
self.statement_list()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred182
@@ -17914,7 +17914,7 @@ class CParser(Parser):
self.declaration()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred186
@@ -17929,7 +17929,7 @@ class CParser(Parser):
self.statement()
self.following.pop()
if self.failed:
- return
+ return
# $ANTLR end synpred188
@@ -18388,7 +18388,7 @@ class CParser(Parser):
-
+
FOLLOW_external_declaration_in_translation_unit74 = frozenset([1, 4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66])
FOLLOW_function_definition_in_external_declaration113 = frozenset([1])
diff --git a/BaseTools/Source/Python/Eot/Eot.py b/BaseTools/Source/Python/Eot/Eot.py
index fcde8fd3e22f..712fb3cd17c9 100644
--- a/BaseTools/Source/Python/Eot/Eot.py
+++ b/BaseTools/Source/Python/Eot/Eot.py
@@ -51,7 +51,7 @@ class MultipleFv(FirmwareVolume):
Fv.frombuffer(Buf, 0, len(Buf))
self.BasicInfo.append([Fv.Name, Fv.FileSystemGuid, Fv.Size])
- self.FfsDict.append(Fv.FfsDict)
+ self.FfsDict.append(Fv.FfsDict)
## Class Eot
#
@@ -82,7 +82,7 @@ class Eot(object):
self.FvFileList = FvFileList
self.MapFileList = MapFileList
self.Dispatch = Dispatch
-
+
# Check workspace environment
if "EFI_SOURCE" not in os.environ:
if "EDK_SOURCE" not in os.environ:
@@ -122,13 +122,13 @@ class Eot(object):
if not os.path.isfile(MapFile):
EdkLogger.error("Eot", EdkLogger.EOT_ERROR, "Can not find file %s " % MapFile)
EotGlobalData.gMAP_FILE.append(MapFile)
-
+
# Generate source file list
self.GenerateSourceFileList(self.SourceFileList, self.IncludeDirList)
# Generate guid list of dec file list
self.ParseDecFile(self.DecFileList)
-
+
# Generate guid list from GUID list file
self.ParseGuidList(self.GuidList)
@@ -188,7 +188,7 @@ class Eot(object):
if len(list) == 2:
EotGlobalData.gGuidDict[list[0].strip()] = GuidStructureStringToGuidString(list[1].strip())
-
+
## ParseGuidList() method
#
# Parse Guid list and get all GUID names with GUID values as {GuidName : GuidValue}
@@ -203,7 +203,7 @@ class Eot(object):
for Line in open(Path):
(GuidName, GuidValue) = Line.split()
EotGlobalData.gGuidDict[GuidName] = GuidValue
-
+
## ConvertLogFile() method
#
# Parse a real running log file to get real dispatch order
@@ -557,7 +557,7 @@ class Eot(object):
if Options.FvFileList:
self.FvFileList = Options.FvFileList
-
+
if Options.MapFileList:
self.MapFileList = Options.FvMapFileList
@@ -569,7 +569,7 @@ class Eot(object):
if Options.DecFileList:
self.DecFileList = Options.DecFileList
-
+
if Options.GuidList:
self.GuidList = Options.GuidList
diff --git a/BaseTools/Source/Python/Eot/Report.py b/BaseTools/Source/Python/Eot/Report.py
index 7435b4d7c930..d631c036bad0 100644
--- a/BaseTools/Source/Python/Eot/Report.py
+++ b/BaseTools/Source/Python/Eot/Report.py
@@ -276,13 +276,13 @@ class Report(object):
</tr>
<tr id='Ffs%s' style='display:none;'>
<td colspan="4"><table width="100%%" border="1">""" % (self.FfsIndex, self.FfsIndex, self.FfsIndex, FfsPath, FfsName, FfsGuid, FfsOffset, FfsType, self.FfsIndex)
-
+
if self.DispatchList:
if FfsObj.Type in [0x04, 0x06]:
self.DispatchList.write("%s %s %s %s\n" % (FfsGuid, "P", FfsName, FfsPath))
if FfsObj.Type in [0x05, 0x07, 0x08, 0x0A]:
self.DispatchList.write("%s %s %s %s\n" % (FfsGuid, "D", FfsName, FfsPath))
-
+
self.WriteLn(Content)
EotGlobalData.gOP_DISPATCH_ORDER.write('%s\n' %FfsName)
diff --git a/BaseTools/Source/Python/GenFds/Capsule.py b/BaseTools/Source/Python/GenFds/Capsule.py
index fbd48f3c6d76..e2c8e388c50b 100644
--- a/BaseTools/Source/Python/GenFds/Capsule.py
+++ b/BaseTools/Source/Python/GenFds/Capsule.py
@@ -205,7 +205,7 @@ class Capsule (CapsuleClassObject) :
return GenFds.ImageBinDict[self.UiCapsuleName.upper() + 'cap']
GenFdsGlobalVariable.InfLogger( "\nGenerate %s Capsule" %self.UiCapsuleName)
- if ('CAPSULE_GUID' in self.TokensDict and
+ if ('CAPSULE_GUID' in self.TokensDict and
uuid.UUID(self.TokensDict['CAPSULE_GUID']) == uuid.UUID('6DCBD5ED-E82D-4C44-BDA1-7194199AD92A')):
return self.GenFmpCapsule()
diff --git a/BaseTools/Source/Python/GenFds/CapsuleData.py b/BaseTools/Source/Python/GenFds/CapsuleData.py
index b376d6b2e9be..9766a2c984a1 100644
--- a/BaseTools/Source/Python/GenFds/CapsuleData.py
+++ b/BaseTools/Source/Python/GenFds/CapsuleData.py
@@ -32,13 +32,13 @@ class CapsuleData:
# @param self The object pointer
def __init__(self):
pass
-
+
## generate capsule data
#
# @param self The object pointer
def GenCapsuleSubItem(self):
pass
-
+
## FFS class for capsule data
#
#
@@ -119,7 +119,7 @@ class CapsuleFd (CapsuleData):
else:
FdFile = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.FdName)
return FdFile
-
+
## AnyFile class for capsule data
#
#
@@ -139,7 +139,7 @@ class CapsuleAnyFile (CapsuleData):
#
def GenCapsuleSubItem(self):
return self.FileName
-
+
## Afile class for capsule data
#
#
@@ -208,11 +208,11 @@ class CapsulePayload(CapsuleData):
Guid = self.ImageTypeId.split('-')
Buffer = pack('=ILHHBBBBBBBBBBBBIIQ',
int(self.Version,16),
- int(Guid[0], 16),
- int(Guid[1], 16),
- int(Guid[2], 16),
- int(Guid[3][-4:-2], 16),
- int(Guid[3][-2:], 16),
+ int(Guid[0], 16),
+ int(Guid[1], 16),
+ int(Guid[2], 16),
+ int(Guid[3][-4:-2], 16),
+ int(Guid[3][-2:], 16),
int(Guid[4][-12:-10], 16),
int(Guid[4][-10:-8], 16),
int(Guid[4][-8:-6], 16),
diff --git a/BaseTools/Source/Python/GenFds/EfiSection.py b/BaseTools/Source/Python/GenFds/EfiSection.py
index 5bb1ae6f664c..635070365b06 100644
--- a/BaseTools/Source/Python/GenFds/EfiSection.py
+++ b/BaseTools/Source/Python/GenFds/EfiSection.py
@@ -54,7 +54,7 @@ class EfiSection (EfiSectionClassObject):
# @retval tuple (Generated file name list, section alignment)
#
def GenSection(self, OutputPath, ModuleName, SecNum, KeyStringList, FfsInf = None, Dict = {}, IsMakefile = False) :
-
+
if self.FileName is not None and self.FileName.startswith('PCD('):
self.FileName = GenFdsGlobalVariable.GetPcdValue(self.FileName)
"""Prepare the parameter of GenSection"""
@@ -154,7 +154,7 @@ class EfiSection (EfiSectionClassObject):
BuildNumTuple = tuple()
BuildNumString = ' ' + ' '.join(BuildNumTuple)
- #if VerString == '' and
+ #if VerString == '' and
if BuildNumString == '':
if self.Optional == True :
GenFdsGlobalVariable.VerboseLogger( "Optional Section don't exist!")
@@ -239,7 +239,7 @@ class EfiSection (EfiSectionClassObject):
Num = '%s.%d' %(SecNum , Index)
OutputFile = os.path.join( OutputPath, ModuleName + 'SEC' + Num + Ffs.SectionSuffix.get(SectionType))
File = GenFdsGlobalVariable.MacroExtend(File, Dict)
-
+
#Get PE Section alignment when align is set to AUTO
if self.Alignment == 'Auto' and (SectionType == 'PE32' or SectionType == 'TE'):
ImageObj = PeImageClass (File)
@@ -283,7 +283,7 @@ class EfiSection (EfiSectionClassObject):
IsMakefile = IsMakefile
)
File = StrippedFile
-
+
"""For TE Section call GenFw to generate TE image"""
if SectionType == 'TE':
diff --git a/BaseTools/Source/Python/GenFds/Fd.py b/BaseTools/Source/Python/GenFds/Fd.py
index cc4124ad902e..3a90a72157e5 100644
--- a/BaseTools/Source/Python/GenFds/Fd.py
+++ b/BaseTools/Source/Python/GenFds/Fd.py
@@ -102,7 +102,7 @@ class FD(FDClassObject):
pass
GenFdsGlobalVariable.VerboseLogger('Call each region\'s AddToBuffer function')
RegionObj.AddToBuffer (TempFdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFds.ImageBinDict, self.vtfRawDict, self.DefineVarDict)
-
+
FdBuffer = StringIO.StringIO('')
PreviousRegionStart = -1
PreviousRegionSize = 1
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index 80ff3ece43b4..29da68e14ff8 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -172,7 +172,7 @@ class IncludeFileProfile :
self.InsertAdjust = 0
self.IncludeFileList = []
self.Level = 1 # first level include file
-
+
def GetTotalLines(self):
TotalLines = self.InsertAdjust + len(self.FileLinesList)
@@ -190,7 +190,7 @@ class IncludeFileProfile :
def GetLineInFile(self, Line):
if not self.IsLineInFile (Line):
return (self.FileName, -1)
-
+
InsertedLines = self.InsertStartLineNumber
for Profile in self.IncludeFileList:
@@ -232,7 +232,7 @@ class FileProfile :
# ECC will use this Dict and List information
self.PcdFileLineDict = {}
self.InfFileLineList = []
-
+
self.FdDict = {}
self.FdNameNotSet = False
self.FvDict = {}
@@ -338,11 +338,11 @@ class FdfParser:
#
# @param self The object pointer
# @param DestLine Optional new destination line number.
- # @param DestOffset Optional new destination offset.
+ # @param DestOffset Optional new destination offset.
#
- def Rewind(self, DestLine = 1, DestOffset = 0):
- self.CurrentLineNumber = DestLine
- self.CurrentOffsetWithinLine = DestOffset
+ def Rewind(self, DestLine = 1, DestOffset = 0):
+ self.CurrentLineNumber = DestLine
+ self.CurrentOffsetWithinLine = DestOffset
## __UndoOneChar() method
#
@@ -458,7 +458,7 @@ class FdfParser:
if MacroName.startswith('!'):
NotFlag = True
MacroName = MacroName[1:].strip()
-
+
if not MacroName.startswith('$(') or not MacroName.endswith(')'):
raise Warning("Macro name expected(Please use '$(%(Token)s)' if '%(Token)s' is a macro.)" % {"Token" : MacroName},
self.FileName, self.CurrentLineNumber)
@@ -663,7 +663,7 @@ class FdfParser:
IncludedFile1 = PathClass(IncludedFile, GlobalData.gWorkspace)
ErrorCode = IncludedFile1.Validate()[0]
if ErrorCode != 0:
- raise Warning("The include file does not exist under below directories: \n%s\n%s\n%s\n"%(os.path.dirname(self.FileName), PlatformDir, GlobalData.gWorkspace),
+ raise Warning("The include file does not exist under below directories: \n%s\n%s\n%s\n"%(os.path.dirname(self.FileName), PlatformDir, GlobalData.gWorkspace),
self.FileName, self.CurrentLineNumber)
if not IsValidInclude (IncludedFile1.Path, self.CurrentLineNumber):
@@ -706,18 +706,18 @@ class FdfParser:
Processed = False
# Preprocess done.
self.Rewind()
-
+
@staticmethod
def __GetIfListCurrentItemStat(IfList):
if len(IfList) == 0:
return True
-
+
for Item in IfList:
if Item[1] == False:
return False
-
+
return True
-
+
## PreprocessConditionalStatement() method
#
# Preprocess conditional statement.
@@ -777,7 +777,7 @@ class FdfParser:
Macro = self.__Token
if not self.__IsToken( "="):
raise Warning("expected '='", self.FileName, self.CurrentLineNumber)
-
+
Value = self.__GetExpression()
self.__SetMacroValue(Macro, Value)
self.__WipeOffArea.append(((DefineLine, DefineOffset), (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))
@@ -807,7 +807,7 @@ class FdfParser:
CondLabel = self.__Token
Expression = self.__GetExpression()
-
+
if CondLabel == '!if':
ConditionSatisfied = self.__EvaluateConditional(Expression, IfList[-1][0][0] + 1, 'eval')
else:
@@ -818,7 +818,7 @@ class FdfParser:
BranchDetermined = ConditionSatisfied
IfList[-1] = [IfList[-1][0], ConditionSatisfied, BranchDetermined]
if ConditionSatisfied:
- self.__WipeOffArea.append((IfList[-1][0], (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))
+ self.__WipeOffArea.append((IfList[-1][0], (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))
elif self.__Token in ('!elseif', '!else'):
ElseStartPos = (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - len(self.__Token))
if len(IfList) <= 0:
@@ -890,7 +890,7 @@ class FdfParser:
ScopeMacro = self.__MacroDict[TAB_COMMON, TAB_COMMON, TAB_COMMON]
if ScopeMacro:
MacroDict.update(ScopeMacro)
-
+
# Section macro
ScopeMacro = self.__MacroDict[
self.__CurSection[0],
@@ -923,12 +923,12 @@ class FdfParser:
else:
return ValueExpression(Expression, MacroPcdDict)()
except WrnExpression, Excpt:
- #
+ #
# Catch expression evaluation warning here. We need to report
# the precise number of line and return the evaluation result
#
EdkLogger.warn('Parser', "Suspicious expression: %s" % str(Excpt),
- File=self.FileName, ExtraData=self.__CurrentLine(),
+ File=self.FileName, ExtraData=self.__CurrentLine(),
Line=Line)
return Excpt.result
except Exception, Excpt:
@@ -947,7 +947,7 @@ class FdfParser:
raise Warning(str(Excpt), *FileLineTuple)
else:
if Expression.startswith('$(') and Expression[-1] == ')':
- Expression = Expression[2:-1]
+ Expression = Expression[2:-1]
return Expression in MacroPcdDict
## __IsToken() method
@@ -1431,9 +1431,9 @@ class FdfParser:
self.__UndoToken()
self.__GetSetStatement(None)
continue
-
+
Macro = self.__Token
-
+
if not self.__IsToken("="):
raise Warning("expected '='", self.FileName, self.CurrentLineNumber)
if not self.__GetNextToken() or self.__Token.startswith('['):
@@ -1480,7 +1480,7 @@ class FdfParser:
else:
raise Warning("expected FdName in [FD.] section", self.FileName, self.CurrentLineNumber)
self.CurrentFdName = FdName.upper()
-
+
if self.CurrentFdName in self.Profile.FdDict:
raise Warning("Unexpected the same FD name", self.FileName, self.CurrentLineNumber)
@@ -1566,12 +1566,12 @@ class FdfParser:
if self.__IsKeyword( "BaseAddress"):
if not self.__IsToken( "="):
raise Warning("expected '='", self.FileName, self.CurrentLineNumber)
-
+
if not self.__GetNextHexNumber():
raise Warning("expected Hex base address", self.FileName, self.CurrentLineNumber)
-
+
Obj.BaseAddress = self.__Token
-
+
if self.__IsToken( "|"):
pcdPair = self.__GetNextPcdName()
Obj.BaseAddressPcd = pcdPair
@@ -1583,7 +1583,7 @@ class FdfParser:
if self.__IsKeyword( "Size"):
if not self.__IsToken( "="):
raise Warning("expected '='", self.FileName, self.CurrentLineNumber)
-
+
if not self.__GetNextHexNumber():
raise Warning("expected Hex size", self.FileName, self.CurrentLineNumber)
@@ -1600,13 +1600,13 @@ class FdfParser:
if self.__IsKeyword( "ErasePolarity"):
if not self.__IsToken( "="):
raise Warning("expected '='", self.FileName, self.CurrentLineNumber)
-
+
if not self.__GetNextToken():
raise Warning("expected Erase Polarity", self.FileName, self.CurrentLineNumber)
-
+
if self.__Token != "1" and self.__Token != "0":
raise Warning("expected 1 or 0 Erase Polarity", self.FileName, self.CurrentLineNumber)
-
+
Obj.ErasePolarity = self.__Token
return True
@@ -1654,7 +1654,7 @@ class FdfParser:
IsBlock = False
while self.__GetBlockStatement(Obj):
IsBlock = True
-
+
Item = Obj.BlockSizeList[-1]
if Item[0] is None or Item[1] is None:
raise Warning("expected block statement", self.FileName, self.CurrentLineNumber)
@@ -1823,7 +1823,7 @@ class FdfParser:
# @retval False Not able to find
#
def __GetRegionLayout(self, Fd):
- Offset = self.__CalcRegionExpr()
+ Offset = self.__CalcRegionExpr()
if Offset is None:
return False
@@ -2139,9 +2139,9 @@ class FdfParser:
while True:
self.__GetSetStatements(FvObj)
- if not (self.__GetBlockStatement(FvObj) or self.__GetFvBaseAddress(FvObj) or
- self.__GetFvForceRebase(FvObj) or self.__GetFvAlignment(FvObj) or
- self.__GetFvAttributes(FvObj) or self.__GetFvNameGuid(FvObj) or
+ if not (self.__GetBlockStatement(FvObj) or self.__GetFvBaseAddress(FvObj) or
+ self.__GetFvForceRebase(FvObj) or self.__GetFvAlignment(FvObj) or
+ self.__GetFvAttributes(FvObj) or self.__GetFvNameGuid(FvObj) or
self.__GetFvExtEntryStatement(FvObj) or self.__GetFvNameString(FvObj)):
break
@@ -2186,7 +2186,7 @@ class FdfParser:
raise Warning("Unknown alignment value '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
Obj.FvAlignment = self.__Token
return True
-
+
## __GetFvBaseAddress() method
#
# Get BaseAddress for FV
@@ -2210,8 +2210,8 @@ class FdfParser:
if not BaseAddrValuePattern.match(self.__Token.upper()):
raise Warning("Unknown FV base address value '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
Obj.FvBaseAddress = self.__Token
- return True
-
+ return True
+
## __GetFvForceRebase() method
#
# Get FvForceRebase for FV
@@ -2234,14 +2234,14 @@ class FdfParser:
if self.__Token.upper() not in ["TRUE", "FALSE", "0", "0X0", "0X00", "1", "0X1", "0X01"]:
raise Warning("Unknown FvForceRebase value '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
-
+
if self.__Token.upper() in ["TRUE", "1", "0X1", "0X01"]:
Obj.FvForceRebase = True
elif self.__Token.upper() in ["FALSE", "0", "0X0", "0X00"]:
Obj.FvForceRebase = False
else:
Obj.FvForceRebase = None
-
+
return True
@@ -2276,7 +2276,7 @@ class FdfParser:
FvObj.FvAttributeDict[name] = self.__Token
return IsWordToken
-
+
## __GetFvNameGuid() method
#
# Get FV GUID for FV
@@ -2322,7 +2322,7 @@ class FdfParser:
if not self.__IsKeyword ("TYPE"):
raise Warning("expected 'TYPE'", self.FileName, self.CurrentLineNumber)
-
+
if not self.__IsToken( "="):
raise Warning("expected '='", self.FileName, self.CurrentLineNumber)
@@ -2343,7 +2343,7 @@ class FdfParser:
if not self.__IsToken( "="):
raise Warning("expected '='", self.FileName, self.CurrentLineNumber)
-
+
if not self.__IsToken( "{"):
raise Warning("expected '{'", self.FileName, self.CurrentLineNumber)
@@ -2374,13 +2374,13 @@ class FdfParser:
FvObj.FvExtEntryData += [DataString]
if self.__Token == 'FILE':
-
+
if not self.__IsToken( "="):
raise Warning("expected '='", self.FileName, self.CurrentLineNumber)
-
+
if not self.__GetNextToken():
raise Warning("expected FV Extension Entry file path At Line ", self.FileName, self.CurrentLineNumber)
-
+
FvObj.FvExtEntryData += [self.__Token]
if not self.__IsToken( "}"):
@@ -2543,7 +2543,7 @@ class FdfParser:
raise Warning("expected ARCH name", self.FileName, self.CurrentLineNumber)
FfsInfObj.UseArch = self.__Token
-
+
if self.__GetNextToken():
p = re.compile(r'([a-zA-Z0-9\-]+|\$\(TARGET\)|\*)_([a-zA-Z0-9\-]+|\$\(TOOL_CHAIN_TAG\)|\*)_([a-zA-Z0-9\-]+|\$\(ARCH\))')
if p.match(self.__Token) and p.match(self.__Token).span()[1] == len(self.__Token):
@@ -2584,7 +2584,7 @@ class FdfParser:
self.__UndoToken()
self.__UndoToken()
return False
-
+
FfsFileObj = FfsFileStatement.FileStatement()
FfsFileObj.FvFileType = self.__Token
@@ -2601,9 +2601,9 @@ class FdfParser:
if not self.__IsToken( ")"):
raise Warning("expected ')'", self.FileName, self.CurrentLineNumber)
self.__Token = 'PCD('+PcdPair[1]+'.'+PcdPair[0]+')'
-
+
FfsFileObj.NameGuid = self.__Token
-
+
self.__GetFilePart( FfsFileObj, MacroDict.copy())
if ForCapsule:
@@ -2879,7 +2879,7 @@ class FdfParser:
else:
VerSectionObj.FileName = self.__Token
Obj.SectionList.append(VerSectionObj)
-
+
elif self.__IsKeyword( "UI"):
if AlignValue == 'Auto':
raise Warning("Auto alignment can only be used in PE32 or TE section ", self.FileName, self.CurrentLineNumber)
@@ -3333,7 +3333,7 @@ class FdfParser:
Value = self.__Token.strip()
else:
Value = self.__Token.strip()
- Obj.TokensDict[Name] = Value
+ Obj.TokensDict[Name] = Value
if not self.__GetNextToken():
return False
self.__UndoToken()
@@ -3475,7 +3475,7 @@ class FdfParser:
if not self.__GetNextToken():
raise Warning("expected File name", self.FileName, self.CurrentLineNumber)
-
+
AnyFileName = self.__Token
self.__VerifyFile(AnyFileName)
@@ -3508,7 +3508,7 @@ class FdfParser:
else:
CapsuleObj.CapsuleDataList.append(CapsuleAnyFile)
return True
-
+
## __GetAfileStatement() method
#
# Get Afile for capsule
@@ -3528,14 +3528,14 @@ class FdfParser:
if not self.__GetNextToken():
raise Warning("expected Afile name", self.FileName, self.CurrentLineNumber)
-
+
AfileName = self.__Token
AfileBaseName = os.path.basename(AfileName)
-
+
if os.path.splitext(AfileBaseName)[1] not in [".bin",".BIN",".Bin",".dat",".DAT",".Dat",".data",".DATA",".Data"]:
raise Warning('invalid binary file type, should be one of "bin","BIN","Bin","dat","DAT","Dat","data","DATA","Data"', \
self.FileName, self.CurrentLineNumber)
-
+
if not os.path.isabs(AfileName):
AfileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(AfileName)
self.__VerifyFile(AfileName)
@@ -3689,7 +3689,7 @@ class FdfParser:
if not self.__IsToken( ")"):
raise Warning("expected ')'", self.FileName, self.CurrentLineNumber)
self.__Token = 'PCD('+PcdPair[1]+'.'+PcdPair[0]+')'
-
+
NameGuid = self.__Token
KeepReloc = None
@@ -3951,11 +3951,11 @@ class FdfParser:
elif self.__GetNextToken():
if self.__Token not in ("}", "COMPAT16", "PE32", "PIC", "TE", "FV_IMAGE", "RAW", "DXE_DEPEX",\
"UI", "VERSION", "PEI_DEPEX", "GUID", "SMM_DEPEX"):
-
+
if self.__Token.startswith('PCD'):
self.__UndoToken()
self.__GetNextWord()
-
+
if self.__Token == 'PCD':
if not self.__IsToken( "("):
raise Warning("expected '('", self.FileName, self.CurrentLineNumber)
@@ -3963,9 +3963,9 @@ class FdfParser:
if not self.__IsToken( ")"):
raise Warning("expected ')'", self.FileName, self.CurrentLineNumber)
self.__Token = 'PCD('+PcdPair[1]+'.'+PcdPair[0]+')'
-
- EfiSectionObj.FileName = self.__Token
-
+
+ EfiSectionObj.FileName = self.__Token
+
else:
self.__UndoToken()
else:
@@ -4352,7 +4352,7 @@ class FdfParser:
self.SectionParser(S)
self.__UndoToken()
return False
-
+
self.__UndoToken()
if not self.__IsToken("[OptionRom.", True):
raise Warning("Unknown Keyword '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
@@ -4371,7 +4371,7 @@ class FdfParser:
isFile = self.__GetOptRomFileStatement(OptRomObj)
if not isInf and not isFile:
break
-
+
return True
## __GetOptRomInfStatement() method
@@ -4412,9 +4412,9 @@ class FdfParser:
else:
self.Profile.InfDict['ArchTBD'].append(ffsInf.InfFileName)
-
+
self.__GetOptRomOverrides (ffsInf)
-
+
Obj.FfsList.append(ffsInf)
return True
@@ -4476,7 +4476,7 @@ class FdfParser:
EdkLogger.error("FdfParser", FORMAT_INVALID, File=self.FileName, Line=self.CurrentLineNumber)
Obj.OverrideAttribs = Overrides
-
+
## __GetOptRomFileStatement() method
#
# Get FILE statements
@@ -4508,7 +4508,7 @@ class FdfParser:
if FfsFileObj.FileType == 'EFI':
self.__GetOptRomOverrides(FfsFileObj)
-
+
Obj.FfsList.append(FfsFileObj)
return True
@@ -4550,7 +4550,7 @@ class FdfParser:
if hasattr(CapsuleDataObj, 'FvName') and CapsuleDataObj.FvName is not None and CapsuleDataObj.FvName.upper() not in RefFvList:
RefFvList.append (CapsuleDataObj.FvName.upper())
elif hasattr(CapsuleDataObj, 'FdName') and CapsuleDataObj.FdName is not None and CapsuleDataObj.FdName.upper() not in RefFdList:
- RefFdList.append (CapsuleDataObj.FdName.upper())
+ RefFdList.append (CapsuleDataObj.FdName.upper())
elif CapsuleDataObj.Ffs is not None:
if isinstance(CapsuleDataObj.Ffs, FfsFileStatement.FileStatement):
if CapsuleDataObj.Ffs.FvName is not None and CapsuleDataObj.Ffs.FvName.upper() not in RefFvList:
@@ -4645,7 +4645,7 @@ class FdfParser:
RefFvStack = []
RefFvStack.append(FvName)
FdAnalyzedList = []
-
+
Index = 0
while RefFvStack != [] and Index < MaxLength:
Index = Index + 1
@@ -4698,7 +4698,7 @@ class FdfParser:
RefCapStack.append(CapName)
FdAnalyzedList = []
FvAnalyzedList = []
-
+
Index = 0
while RefCapStack != [] and Index < MaxLength:
Index = Index + 1
diff --git a/BaseTools/Source/Python/GenFds/Ffs.py b/BaseTools/Source/Python/GenFds/Ffs.py
index a4178121118b..f3a252ee1b9d 100644
--- a/BaseTools/Source/Python/GenFds/Ffs.py
+++ b/BaseTools/Source/Python/GenFds/Ffs.py
@@ -21,7 +21,7 @@ from CommonDataClass.FdfClass import FDClassObject
#
#
class Ffs(FDClassObject):
-
+
# mapping between MODULE type in FDF (from INF) and file type for GenFfs
ModuleTypeToFileType = {
'SEC' : 'EFI_FV_FILETYPE_SECURITY_CORE',
@@ -38,7 +38,7 @@ class Ffs(FDClassObject):
'MM_STANDALONE' : 'EFI_FV_FILETYPE_MM_STANDALONE',
'MM_CORE_STANDALONE' : 'EFI_FV_FILETYPE_MM_CORE_STANDALONE'
}
-
+
# mapping between FILE type in FDF and file type for GenFfs
FdfFvFileTypeToFileType = {
'SEC' : 'EFI_FV_FILETYPE_SECURITY_CORE',
@@ -56,7 +56,7 @@ class Ffs(FDClassObject):
'MM_STANDALONE' : 'EFI_FV_FILETYPE_MM_STANDALONE',
'MM_CORE_STANDALONE' : 'EFI_FV_FILETYPE_MM_CORE_STANDALONE'
}
-
+
# mapping between section type in FDF and file suffix
SectionSuffix = {
'PE32' : '.pe32',
@@ -68,14 +68,14 @@ class Ffs(FDClassObject):
'COMPAT16' : '.com16',
'RAW' : '.raw',
'FREEFORM_SUBTYPE_GUID': '.guid',
- 'SUBTYPE_GUID' : '.guid',
+ 'SUBTYPE_GUID' : '.guid',
'FV_IMAGE' : 'fv.sec',
'COMPRESS' : '.com',
'GUIDED' : '.guided',
'PEI_DEPEX' : '.dpx',
'SMM_DEPEX' : '.dpx'
}
-
+
## The constructor
#
# @param self The object pointer
diff --git a/BaseTools/Source/Python/GenFds/FfsFileStatement.py b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
index ba8e0465ef34..cb83f6428c23 100644
--- a/BaseTools/Source/Python/GenFds/FfsFileStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
@@ -58,7 +58,7 @@ class FileStatement (FileStatementClassObject) :
# @retval string Generated FFS file name
#
def GenFfs(self, Dict = {}, FvChildAddr=[], FvParentAddr=None, IsMakefile=False, FvName=None):
-
+
if self.NameGuid is not None and self.NameGuid.startswith('PCD('):
PcdValue = GenFdsGlobalVariable.GetPcdValue(self.NameGuid)
if len(PcdValue) == 0:
@@ -71,7 +71,7 @@ class FileStatement (FileStatementClassObject) :
EdkLogger.error("GenFds", GENFDS_ERROR, 'GUID value for %s in wrong format.' \
% (self.NameGuid))
self.NameGuid = RegistryGuidStr
-
+
Str = self.NameGuid
if FvName:
Str += FvName
diff --git a/BaseTools/Source/Python/GenFds/FfsInfStatement.py b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
index 3c5eef40222b..4cbc6bb9ba7f 100644
--- a/BaseTools/Source/Python/GenFds/FfsInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
@@ -225,7 +225,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
EdkLogger.warn("GenFds", GENFDS_ERROR, "Module %s NOT found in DSC file; Is it really a binary module?" % (self.InfFileName))
if self.ModuleType == 'SMM_CORE' and int(self.PiSpecVersion, 16) < 0x0001000A:
- EdkLogger.error("GenFds", FORMAT_NOT_SUPPORTED, "SMM_CORE module type can't be used in the module with PI_SPECIFICATION_VERSION less than 0x0001000A", File=self.InfFileName)
+ EdkLogger.error("GenFds", FORMAT_NOT_SUPPORTED, "SMM_CORE module type can't be used in the module with PI_SPECIFICATION_VERSION less than 0x0001000A", File=self.InfFileName)
if self.ModuleType == 'MM_CORE_STANDALONE' and int(self.PiSpecVersion, 16) < 0x00010032:
EdkLogger.error("GenFds", FORMAT_NOT_SUPPORTED, "MM_CORE_STANDALONE module type can't be used in the module with PI_SPECIFICATION_VERSION less than 0x00010032", File=self.InfFileName)
@@ -374,13 +374,13 @@ class FfsInfStatement(FfsInfStatementClassObject):
def PatchEfiFile(self, EfiFile, FileType):
#
# If the module does not have any patches, then return path to input file
- #
+ #
if not self.PatchPcds:
return EfiFile
#
# Only patch file if FileType is PE32 or ModuleType is USER_DEFINED
- #
+ #
if FileType != 'PE32' and self.ModuleType != "USER_DEFINED":
return EfiFile
@@ -398,7 +398,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
#
# If a different file from the same module has already been patched, then generate an error
- #
+ #
if self.PatchedBinFile:
EdkLogger.error("GenFds", GENFDS_ERROR,
'Only one binary file can be patched:\n'
@@ -408,12 +408,12 @@ class FfsInfStatement(FfsInfStatementClassObject):
#
# Copy unpatched file contents to output file location to perform patching
- #
+ #
CopyLongFilePath(EfiFile, Output)
#
# Apply patches to patched output file
- #
+ #
for Pcd, Value in self.PatchPcds:
RetVal, RetStr = PatchBinaryFile(Output, int(Pcd.Offset, 0), Pcd.DatumType, Value, Pcd.MaxDatumSize)
if RetVal:
@@ -421,12 +421,12 @@ class FfsInfStatement(FfsInfStatementClassObject):
#
# Save the path of the patched output file
- #
+ #
self.PatchedBinFile = Output
#
# Return path to patched output file
- #
+ #
return Output
## GenFfs() method
@@ -448,14 +448,14 @@ class FfsInfStatement(FfsInfStatementClassObject):
Arch = self.GetCurrentArch()
SrcFile = mws.join( GenFdsGlobalVariable.WorkSpaceDir , self.InfFileName);
DestFile = os.path.join( self.OutputPath, self.ModuleGuid + '.ffs')
-
+
SrcFileDir = "."
SrcPath = os.path.dirname(SrcFile)
SrcFileName = os.path.basename(SrcFile)
- SrcFileBase, SrcFileExt = os.path.splitext(SrcFileName)
+ SrcFileBase, SrcFileExt = os.path.splitext(SrcFileName)
DestPath = os.path.dirname(DestFile)
DestFileName = os.path.basename(DestFile)
- DestFileBase, DestFileExt = os.path.splitext(DestFileName)
+ DestFileBase, DestFileExt = os.path.splitext(DestFileName)
self.MacroDict = {
# source file
"${src}" : SrcFile,
@@ -473,7 +473,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
}
#
# Allow binary type module not specify override rule in FDF file.
- #
+ #
if len(self.BinFileList) > 0:
if self.Rule is None or self.Rule == "":
self.Rule = "BINARY"
@@ -534,7 +534,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
'$(NAMED_GUID)' : self.ModuleGuid
}
String = GenFdsGlobalVariable.MacroExtend(String, MacroDict)
- String = GenFdsGlobalVariable.MacroExtend(String, self.MacroDict)
+ String = GenFdsGlobalVariable.MacroExtend(String, self.MacroDict)
return String
## __GetRule__() method
@@ -960,14 +960,14 @@ class FfsInfStatement(FfsInfStatementClassObject):
Sect.FvAddr = FvChildAddr
if FvParentAddr is not None and isinstance(Sect, GuidSection):
Sect.FvParentAddr = FvParentAddr
-
+
if Rule.KeyStringList != []:
SectList, Align = Sect.GenSection(self.OutputPath , self.ModuleGuid, SecIndex, Rule.KeyStringList, self, IsMakefile = IsMakefile)
else :
SectList, Align = Sect.GenSection(self.OutputPath , self.ModuleGuid, SecIndex, self.KeyStringList, self, IsMakefile = IsMakefile)
-
+
if not HasGeneratedFlag:
- UniVfrOffsetFileSection = ""
+ UniVfrOffsetFileSection = ""
ModuleFileName = mws.join(GenFdsGlobalVariable.WorkSpaceDir, self.InfFileName)
InfData = GenFdsGlobalVariable.WorkSpace.BuildObject[PathClass(ModuleFileName), self.CurrentArch]
#
@@ -978,16 +978,16 @@ class FfsInfStatement(FfsInfStatementClassObject):
for SourceFile in InfData.Sources:
if SourceFile.Type.upper() == ".VFR" :
#
- # search the .map file to find the offset of vfr binary in the PE32+/TE file.
+ # search the .map file to find the offset of vfr binary in the PE32+/TE file.
#
VfrUniBaseName[SourceFile.BaseName] = (SourceFile.BaseName + "Bin")
if SourceFile.Type.upper() == ".UNI" :
#
- # search the .map file to find the offset of Uni strings binary in the PE32+/TE file.
+ # search the .map file to find the offset of Uni strings binary in the PE32+/TE file.
#
VfrUniBaseName["UniOffsetName"] = (self.BaseName + "Strings")
-
-
+
+
if len(VfrUniBaseName) > 0:
if IsMakefile:
if InfData.BuildType != 'UEFI_HII':
@@ -1023,7 +1023,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
if UniVfrOffsetFileSection:
SectList.append(UniVfrOffsetFileSection)
HasGeneratedFlag = True
-
+
for SecName in SectList :
SectFiles.append(SecName)
SectAlignments.append(Align)
@@ -1071,12 +1071,12 @@ class FfsInfStatement(FfsInfStatementClassObject):
# @param self The object pointer
# @param VfrUniBaseName A name list contain the UNI/INF object name.
# @retval RetValue A list contain offset of UNI/INF object.
- #
+ #
def __GetBuildOutputMapFileVfrUniInfo(self, VfrUniBaseName):
MapFileName = os.path.join(self.EfiOutputPath, self.BaseName + ".map")
EfiFileName = os.path.join(self.EfiOutputPath, self.BaseName + ".efi")
return GetVariableOffset(MapFileName, EfiFileName, VfrUniBaseName.values())
-
+
## __GenUniVfrOffsetFile() method
#
# Generate the offset file for the module which contain VFR or UNI file.
@@ -1088,8 +1088,8 @@ class FfsInfStatement(FfsInfStatementClassObject):
def __GenUniVfrOffsetFile(VfrUniOffsetList, UniVfrOffsetFileName):
# Use a instance of StringIO to cache data
- fStringIO = StringIO.StringIO('')
-
+ fStringIO = StringIO.StringIO('')
+
for Item in VfrUniOffsetList:
if (Item[0].find("Strings") != -1):
#
@@ -1099,7 +1099,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
#
UniGuid = [0xe0, 0xc5, 0x13, 0x89, 0xf6, 0x33, 0x86, 0x4d, 0x9b, 0xf1, 0x43, 0xef, 0x89, 0xfc, 0x6, 0x66]
UniGuid = [chr(ItemGuid) for ItemGuid in UniGuid]
- fStringIO.write(''.join(UniGuid))
+ fStringIO.write(''.join(UniGuid))
UniValue = pack ('Q', int (Item[1], 16))
fStringIO.write (UniValue)
else:
@@ -1110,11 +1110,11 @@ class FfsInfStatement(FfsInfStatementClassObject):
#
VfrGuid = [0xb4, 0x7c, 0xbc, 0xd0, 0x47, 0x6a, 0x5f, 0x49, 0xaa, 0x11, 0x71, 0x7, 0x46, 0xda, 0x6, 0xa2]
VfrGuid = [chr(ItemGuid) for ItemGuid in VfrGuid]
- fStringIO.write(''.join(VfrGuid))
- type (Item[1])
+ fStringIO.write(''.join(VfrGuid))
+ type (Item[1])
VfrValue = pack ('Q', int (Item[1], 16))
fStringIO.write (VfrValue)
-
+
#
# write data into file.
#
@@ -1122,7 +1122,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
SaveFileOnChange(UniVfrOffsetFileName, fStringIO.getvalue())
except:
EdkLogger.error("GenFds", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." %UniVfrOffsetFileName,None)
-
+
fStringIO.close ()
-
+
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index 0fb2bd456a6d..2e57c5e92365 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -53,7 +53,7 @@ class FV (FvClassObject):
self.FvForceRebase = None
self.FvRegionInFD = None
self.UsedSizeEnable = False
-
+
## AddToBuffer()
#
# Generate Fv and add it to the Buffer
@@ -72,7 +72,7 @@ class FV (FvClassObject):
if BaseAddress is None and self.UiFvName.upper() + 'fv' in GenFds.ImageBinDict:
return GenFds.ImageBinDict[self.UiFvName.upper() + 'fv']
-
+
#
# Check whether FV in Capsule is in FD flash region.
# If yes, return error. Doesn't support FV in Capsule image is also in FD flash region.
@@ -92,7 +92,7 @@ class FV (FvClassObject):
GenFdsGlobalVariable.InfLogger( "\nGenerating %s FV" %self.UiFvName)
GenFdsGlobalVariable.LargeFileInFvFlags.append(False)
FFSGuid = None
-
+
if self.FvBaseAddress is not None:
BaseAddress = self.FvBaseAddress
if not Flag:
@@ -289,7 +289,7 @@ class FV (FvClassObject):
if not self._GetBlockSize():
#set default block size is 1
self.FvInfFile.writelines("EFI_BLOCK_SIZE = 0x1" + T_CHAR_LF)
-
+
for BlockSize in self.BlockSizeList :
if BlockSize[0] is not None:
self.FvInfFile.writelines("EFI_BLOCK_SIZE = " + \
@@ -331,7 +331,7 @@ class FV (FvClassObject):
self.FvAlignment.strip() + \
" = TRUE" + \
T_CHAR_LF)
-
+
#
# Generate FV extension header file
#
@@ -390,7 +390,7 @@ class FV (FvClassObject):
TotalSize += (Size + 4)
FvExtFile.seek(0)
Buffer += pack('HH', (Size + 4), int(self.FvExtEntryTypeValue[Index], 16))
- Buffer += FvExtFile.read()
+ Buffer += FvExtFile.read()
FvExtFile.close()
if self.FvExtEntryType[Index] == 'DATA':
ByteList = self.FvExtEntryData[Index].split(',')
@@ -403,12 +403,12 @@ class FV (FvClassObject):
Buffer += pack('B', int(ByteList[Index1], 16))
Guid = self.FvNameGuid.split('-')
- Buffer = pack('=LHHBBBBBBBBL',
- int(Guid[0], 16),
- int(Guid[1], 16),
- int(Guid[2], 16),
- int(Guid[3][-4:-2], 16),
- int(Guid[3][-2:], 16),
+ Buffer = pack('=LHHBBBBBBBBL',
+ int(Guid[0], 16),
+ int(Guid[1], 16),
+ int(Guid[2], 16),
+ int(Guid[3][-4:-2], 16),
+ int(Guid[3][-2:], 16),
int(Guid[4][-12:-10], 16),
int(Guid[4][-10:-8], 16),
int(Guid[4][-8:-6], 16),
@@ -434,7 +434,7 @@ class FV (FvClassObject):
FvExtHeaderFileName + \
T_CHAR_LF)
-
+
#
# Add [Files]
#
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index 74017e72629b..4b8c7913d2db 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -71,10 +71,10 @@ def main():
if Options.verbose is not None:
EdkLogger.SetLevel(EdkLogger.VERBOSE)
GenFdsGlobalVariable.VerboseMode = True
-
+
if Options.FixedAddress is not None:
GenFdsGlobalVariable.FixedLoadAddress = True
-
+
if Options.quiet is not None:
EdkLogger.SetLevel(EdkLogger.QUIET)
if Options.debug is not None:
@@ -99,7 +99,7 @@ def main():
if Options.GenfdsMultiThread:
GenFdsGlobalVariable.EnableGenfdsMultiThread = True
os.chdir(GenFdsGlobalVariable.WorkSpaceDir)
-
+
# set multiple workspace
PackagesPath = os.getenv("PACKAGES_PATH")
mws.setWs(GenFdsGlobalVariable.WorkSpaceDir, PackagesPath)
@@ -227,7 +227,7 @@ def main():
GlobalData.gDatabasePath = os.path.normpath(os.path.join(ConfDirectoryPath, GlobalData.gDatabasePath))
BuildWorkSpace = WorkspaceDatabase(GlobalData.gDatabasePath)
BuildWorkSpace.InitDatabase()
-
+
#
# Get files real name in workspace dir
#
@@ -243,7 +243,7 @@ def main():
TargetArchList = set(BuildWorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, TAB_COMMON, Options.BuildTarget, Options.ToolChain].SupArchList) & set(ArchList)
if len(TargetArchList) == 0:
EdkLogger.error("GenFds", GENFDS_ERROR, "Target ARCH %s not in platform supported ARCH %s" % (str(ArchList), str(BuildWorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, TAB_COMMON].SupArchList)))
-
+
for Arch in ArchList:
GenFdsGlobalVariable.OutputDirFromDscDict[Arch] = NormPath(BuildWorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, Options.BuildTarget, Options.ToolChain].OutputDirectory)
GenFdsGlobalVariable.PlatformName = BuildWorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, Options.BuildTarget, Options.ToolChain].PlatformName
@@ -550,7 +550,7 @@ class GenFds :
Buffer = StringIO.StringIO('')
FvObj.AddToBuffer(Buffer)
Buffer.close()
-
+
if GenFds.OnlyGenerateThisFv is None and GenFds.OnlyGenerateThisFd is None and GenFds.OnlyGenerateThisCap is None:
if GenFdsGlobalVariable.FdfParser.Profile.CapsuleDict != {}:
GenFdsGlobalVariable.VerboseLogger("\n Generate other Capsule images!")
@@ -616,7 +616,7 @@ class GenFds :
# @retval None
#
def DisplayFvSpaceInfo(FdfParser):
-
+
FvSpaceInfoList = []
MaxFvNameLength = 0
for FvName in FdfParser.Profile.FvDict:
@@ -643,10 +643,10 @@ class GenFds :
if NameValue[0].strip() == 'EFI_FV_SPACE_SIZE':
FreeFound = True
Free = NameValue[1].strip()
-
+
if TotalFound and UsedFound and FreeFound:
FvSpaceInfoList.append((FvName, Total, Used, Free))
-
+
GenFdsGlobalVariable.InfLogger('\nFV Space Information')
for FvSpaceInfo in FvSpaceInfoList:
Name = FvSpaceInfo[0]
@@ -674,18 +674,18 @@ class GenFds :
if PcdObj.TokenCName == 'PcdBsBaseAddress':
PcdValue = PcdObj.DefaultValue
break
-
+
if PcdValue == '':
return
-
+
Int64PcdValue = long(PcdValue, 0)
- if Int64PcdValue == 0 or Int64PcdValue < -1:
+ if Int64PcdValue == 0 or Int64PcdValue < -1:
return
-
+
TopAddress = 0
if Int64PcdValue > 0:
TopAddress = Int64PcdValue
-
+
ModuleDict = BuildDb.BuildObject[DscFile, TAB_COMMON, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].Modules
for Key in ModuleDict:
ModuleObj = BuildDb.BuildObject[Key, TAB_COMMON, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index 6745a89514b7..fac9fee0bea6 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -64,7 +64,7 @@ class GenFdsGlobalVariable:
FdfFileTimeStamp = 0
FixedLoadAddress = False
PlatformName = ''
-
+
BuildRuleFamily = "MSFT"
ToolChainFamily = "MSFT"
__BuildRuleDatabase = None
@@ -74,7 +74,7 @@ class GenFdsGlobalVariable:
CopyList = []
ModuleFile = ''
EnableGenfdsMultiThread = False
-
+
#
# The list whose element are flags to indicate if large FFS or SECTION files exist in FV.
# At the beginning of each generation of FV, false flag is appended to the list,
@@ -89,7 +89,7 @@ class GenFdsGlobalVariable:
LARGE_FILE_SIZE = 0x1000000
SectionHeader = struct.Struct("3B 1B")
-
+
## LoadBuildRule
#
@staticmethod
@@ -116,7 +116,7 @@ class GenFdsGlobalVariable:
and GenFdsGlobalVariable.ToolChainTag in ToolDefinition[DataType.TAB_TOD_DEFINES_BUILDRULEFAMILY] \
and ToolDefinition[DataType.TAB_TOD_DEFINES_BUILDRULEFAMILY][GenFdsGlobalVariable.ToolChainTag]:
GenFdsGlobalVariable.BuildRuleFamily = ToolDefinition[DataType.TAB_TOD_DEFINES_BUILDRULEFAMILY][GenFdsGlobalVariable.ToolChainTag]
-
+
if DataType.TAB_TOD_DEFINES_FAMILY in ToolDefinition \
and GenFdsGlobalVariable.ToolChainTag in ToolDefinition[DataType.TAB_TOD_DEFINES_FAMILY] \
and ToolDefinition[DataType.TAB_TOD_DEFINES_FAMILY][GenFdsGlobalVariable.ToolChainTag]:
@@ -228,11 +228,11 @@ class GenFdsGlobalVariable:
while Index < len(SourceList):
Source = SourceList[Index]
Index = Index + 1
-
+
if File.IsBinary and File == Source and Inf.Binaries is not None and File in Inf.Binaries:
# Skip all files that are not binary libraries
if not Inf.LibraryClass:
- continue
+ continue
RuleObject = BuildRules[DataType.TAB_DEFAULT_BINARY_FILE]
elif FileType in BuildRules:
RuleObject = BuildRules[FileType]
@@ -243,15 +243,15 @@ class GenFdsGlobalVariable:
if LastTarget:
TargetList.add(str(LastTarget))
break
-
+
FileType = RuleObject.SourceFileType
-
+
# stop at STATIC_LIBRARY for library
if Inf.LibraryClass and FileType == DataType.TAB_STATIC_LIBRARY:
if LastTarget:
TargetList.add(str(LastTarget))
break
-
+
Target = RuleObject.Apply(Source)
if not Target:
if LastTarget:
@@ -260,11 +260,11 @@ class GenFdsGlobalVariable:
elif not Target.Outputs:
# Only do build for target with outputs
TargetList.add(str(Target))
-
+
# to avoid cyclic rule
if FileType in RuleChain:
break
-
+
RuleChain.append(FileType)
SourceList.extend(Target.Outputs)
LastTarget = Target
@@ -647,19 +647,19 @@ class GenFdsGlobalVariable:
@staticmethod
def GenerateOptionRom(Output, EfiInput, BinaryInput, Compress=False, ClassCode=None,
Revision=None, DeviceId=None, VendorId=None, IsMakefile=False):
- InputList = []
+ InputList = []
Cmd = ["EfiRom"]
if len(EfiInput) > 0:
-
+
if Compress:
Cmd += ["-ec"]
else:
Cmd += ["-e"]
-
+
for EfiFile in EfiInput:
Cmd += [EfiFile]
InputList.append (EfiFile)
-
+
if len(BinaryInput) > 0:
Cmd += ["-b"]
for BinFile in BinaryInput:
@@ -670,7 +670,7 @@ class GenFdsGlobalVariable:
if not GenFdsGlobalVariable.NeedsUpdate(Output, InputList) and not IsMakefile:
return
GenFdsGlobalVariable.DebugLogger(EdkLogger.DEBUG_5, "%s needs update because of newer %s" % (Output, InputList))
-
+
if ClassCode is not None:
Cmd += ["-l", ClassCode]
if Revision is not None:
@@ -813,7 +813,7 @@ class GenFdsGlobalVariable:
EdkLogger.error("GenFds", GENFDS_ERROR, "%s is not FixedAtBuild type." % PcdPattern)
if PcdObj.DatumType != DataType.TAB_VOID:
EdkLogger.error("GenFds", GENFDS_ERROR, "%s is not VOID* datum type." % PcdPattern)
-
+
PcdValue = PcdObj.DefaultValue
return PcdValue
@@ -829,7 +829,7 @@ class GenFdsGlobalVariable:
EdkLogger.error("GenFds", GENFDS_ERROR, "%s is not FixedAtBuild type." % PcdPattern)
if PcdObj.DatumType != DataType.TAB_VOID:
EdkLogger.error("GenFds", GENFDS_ERROR, "%s is not VOID* datum type." % PcdPattern)
-
+
PcdValue = PcdObj.DefaultValue
return PcdValue
diff --git a/BaseTools/Source/Python/GenFds/GuidSection.py b/BaseTools/Source/Python/GenFds/GuidSection.py
index 08665a3d4d49..eebdafd823bb 100644
--- a/BaseTools/Source/Python/GenFds/GuidSection.py
+++ b/BaseTools/Source/Python/GenFds/GuidSection.py
@@ -74,7 +74,7 @@ class GuidSection(GuidSectionClassObject) :
FvAddrIsSet = True
else:
FvAddrIsSet = False
-
+
if self.ProcessRequired in ("TRUE", "1"):
if self.FvAddr != []:
#no use FvAddr when the image is processed.
diff --git a/BaseTools/Source/Python/GenFds/OptRomFileStatement.py b/BaseTools/Source/Python/GenFds/OptRomFileStatement.py
index 4ef9b4d0e9a8..744c2b0422d9 100644
--- a/BaseTools/Source/Python/GenFds/OptRomFileStatement.py
+++ b/BaseTools/Source/Python/GenFds/OptRomFileStatement.py
@@ -18,7 +18,7 @@
import Common.LongFilePathOs as os
from GenFdsGlobalVariable import GenFdsGlobalVariable
-##
+##
#
#
class OptRomFileStatement:
@@ -40,10 +40,10 @@ class OptRomFileStatement:
# @retval string Generated FFS file name
#
def GenFfs(self, Dict = {}, IsMakefile=False):
-
+
if self.FileName is not None:
self.FileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.FileName)
-
+
return self.FileName
diff --git a/BaseTools/Source/Python/GenFds/OptRomInfStatement.py b/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
index 62d731fb9cca..1ef82f7106a1 100644
--- a/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
@@ -26,7 +26,7 @@ from Common.String import *
from FfsInfStatement import FfsInfStatement
from GenFdsGlobalVariable import GenFdsGlobalVariable
-##
+##
#
#
class OptRomInfStatement (FfsInfStatement):
@@ -45,7 +45,7 @@ class OptRomInfStatement (FfsInfStatement):
# @param self The object pointer
#
def __GetOptRomParams(self):
-
+
if self.OverrideAttribs is None:
self.OverrideAttribs = OptionRom.OverrideAttribs()
@@ -59,21 +59,21 @@ class OptRomInfStatement (FfsInfStatement):
if self.OverrideAttribs.PciVendorId is None:
self.OverrideAttribs.PciVendorId = self.OptRomDefs.get ('PCI_VENDOR_ID')
-
+
if self.OverrideAttribs.PciClassCode is None:
self.OverrideAttribs.PciClassCode = self.OptRomDefs.get ('PCI_CLASS_CODE')
-
+
if self.OverrideAttribs.PciDeviceId is None:
self.OverrideAttribs.PciDeviceId = self.OptRomDefs.get ('PCI_DEVICE_ID')
-
+
if self.OverrideAttribs.PciRevision is None:
self.OverrideAttribs.PciRevision = self.OptRomDefs.get ('PCI_REVISION')
-
-# InfObj = GenFdsGlobalVariable.WorkSpace.BuildObject[self.PathClassObj, self.CurrentArch]
+
+# InfObj = GenFdsGlobalVariable.WorkSpace.BuildObject[self.PathClassObj, self.CurrentArch]
# RecordList = InfObj._RawData[MODEL_META_DATA_HEADER, InfObj._Arch, InfObj._Platform]
# for Record in RecordList:
# Record = ReplaceMacros(Record, GlobalData.gEdkGlobal, False)
-# Name = Record[0]
+# Name = Record[0]
## GenFfs() method
#
# Generate FFS
@@ -148,8 +148,7 @@ class OptRomInfStatement (FfsInfStatement):
OutputFileList.append(GenSecInputFile)
else:
FileList, IsSect = Section.Section.GetFileList(self, '', Sect.FileExtension)
- OutputFileList.extend(FileList)
-
+ OutputFileList.extend(FileList)
+
return OutputFileList
-
\ No newline at end of file
diff --git a/BaseTools/Source/Python/GenFds/OptionRom.py b/BaseTools/Source/Python/GenFds/OptionRom.py
index b05841529940..7373a6a2f0bf 100644
--- a/BaseTools/Source/Python/GenFds/OptionRom.py
+++ b/BaseTools/Source/Python/GenFds/OptionRom.py
@@ -29,7 +29,7 @@ from Common.BuildToolError import *
T_CHAR_LF = '\n'
-##
+##
#
#
class OPTIONROM (OptionRomClassObject):
@@ -58,7 +58,7 @@ class OPTIONROM (OptionRomClassObject):
# Process Modules in FfsList
for FfsFile in self.FfsList :
-
+
if isinstance(FfsFile, OptRomInfStatement.OptRomInfStatement):
FilePathNameList = FfsFile.GenFfs(IsMakefile=Flag)
if len(FilePathNameList) == 0:
@@ -71,14 +71,14 @@ class OPTIONROM (OptionRomClassObject):
if not os.path.exists(TmpOutputDir) :
os.makedirs(TmpOutputDir)
TmpOutputFile = os.path.join(TmpOutputDir, FileName+'.tmp')
-
- GenFdsGlobalVariable.GenerateOptionRom(TmpOutputFile,
- FilePathNameList,
- [],
- FfsFile.OverrideAttribs.NeedCompress,
- FfsFile.OverrideAttribs.PciClassCode,
- FfsFile.OverrideAttribs.PciRevision,
- FfsFile.OverrideAttribs.PciDeviceId,
+
+ GenFdsGlobalVariable.GenerateOptionRom(TmpOutputFile,
+ FilePathNameList,
+ [],
+ FfsFile.OverrideAttribs.NeedCompress,
+ FfsFile.OverrideAttribs.PciClassCode,
+ FfsFile.OverrideAttribs.PciRevision,
+ FfsFile.OverrideAttribs.PciDeviceId,
FfsFile.OverrideAttribs.PciVendorId,
IsMakefile = Flag)
BinFileList.append(TmpOutputFile)
@@ -90,14 +90,14 @@ class OPTIONROM (OptionRomClassObject):
if not os.path.exists(TmpOutputDir) :
os.makedirs(TmpOutputDir)
TmpOutputFile = os.path.join(TmpOutputDir, FileName+'.tmp')
-
- GenFdsGlobalVariable.GenerateOptionRom(TmpOutputFile,
- [FilePathName],
- [],
- FfsFile.OverrideAttribs.NeedCompress,
- FfsFile.OverrideAttribs.PciClassCode,
- FfsFile.OverrideAttribs.PciRevision,
- FfsFile.OverrideAttribs.PciDeviceId,
+
+ GenFdsGlobalVariable.GenerateOptionRom(TmpOutputFile,
+ [FilePathName],
+ [],
+ FfsFile.OverrideAttribs.NeedCompress,
+ FfsFile.OverrideAttribs.PciClassCode,
+ FfsFile.OverrideAttribs.PciRevision,
+ FfsFile.OverrideAttribs.PciDeviceId,
FfsFile.OverrideAttribs.PciVendorId,
IsMakefile=Flag)
BinFileList.append(TmpOutputFile)
@@ -106,13 +106,13 @@ class OPTIONROM (OptionRomClassObject):
EfiFileList.append(FilePathName)
else:
BinFileList.append(FilePathName)
-
+
#
# Call EfiRom tool
#
OutputFile = os.path.join(GenFdsGlobalVariable.FvDir, self.DriverName)
OutputFile = OutputFile + '.rom'
-
+
GenFdsGlobalVariable.GenerateOptionRom(
OutputFile,
EfiFileList,
@@ -122,21 +122,20 @@ class OPTIONROM (OptionRomClassObject):
if not Flag:
GenFdsGlobalVariable.InfLogger( "\nGenerate %s Option ROM Successfully" %self.DriverName)
GenFdsGlobalVariable.SharpCounter = 0
-
+
return OutputFile
class OverrideAttribs:
-
+
## The constructor
#
# @param self The object pointer
#
def __init__(self):
-
+
self.PciVendorId = None
self.PciClassCode = None
self.PciDeviceId = None
self.PciRevision = None
self.NeedCompress = None
-
-
\ No newline at end of file
+
diff --git a/BaseTools/Source/Python/GenFds/Region.py b/BaseTools/Source/Python/GenFds/Region.py
index 44f122a71216..1042281fe314 100644
--- a/BaseTools/Source/Python/GenFds/Region.py
+++ b/BaseTools/Source/Python/GenFds/Region.py
@@ -309,7 +309,7 @@ class Region(RegionClassObject):
if self.Offset >= End:
Start = End
continue
- # region located in current blocks
+ # region located in current blocks
else:
# region ended within current blocks
if self.Offset + self.Size <= End:
@@ -361,5 +361,5 @@ class Region(RegionClassObject):
else:
Index += 1
-
+
diff --git a/BaseTools/Source/Python/GenFds/Section.py b/BaseTools/Source/Python/GenFds/Section.py
index 4b368b3ada9d..ce10c35dc880 100644
--- a/BaseTools/Source/Python/GenFds/Section.py
+++ b/BaseTools/Source/Python/GenFds/Section.py
@@ -160,7 +160,7 @@ class Section (SectionClassObject):
SuffixMap = FfsInf.GetFinalTargetSuffixMap()
if Suffix in SuffixMap:
FileList.extend(SuffixMap[Suffix])
-
+
#Process the file lists is alphabetical for a same section type
if len (FileList) > 1:
FileList.sort()
diff --git a/BaseTools/Source/Python/GenFds/Vtf.py b/BaseTools/Source/Python/GenFds/Vtf.py
index 18ea37b9afdd..beb3200f23e6 100644
--- a/BaseTools/Source/Python/GenFds/Vtf.py
+++ b/BaseTools/Source/Python/GenFds/Vtf.py
@@ -25,7 +25,7 @@ T_CHAR_LF = '\n'
#
#
class Vtf (VtfClassObject):
-
+
## The constructor
#
# @param self The object pointer
@@ -46,7 +46,7 @@ class Vtf (VtfClassObject):
OutputFile = os.path.join(GenFdsGlobalVariable.FvDir, self.UiName + '.Vtf')
BaseAddArg = self.GetBaseAddressArg(FdAddressDict)
OutputArg, VtfRawDict = self.GenOutputArg()
-
+
Cmd = (
'GenVtf',
) + OutputArg + (
@@ -55,9 +55,9 @@ class Vtf (VtfClassObject):
GenFdsGlobalVariable.CallExternalTool(Cmd, "GenFv -Vtf Failed!")
GenFdsGlobalVariable.SharpCounter = 0
-
+
return VtfRawDict
-
+
## GenBsfInf() method
#
# Generate inf used to generate VTF
@@ -154,7 +154,7 @@ class Vtf (VtfClassObject):
for component in self.ComponentStatementList :
if component.CompLoc.upper() != 'NONE' and not (component.CompLoc.upper() in FvList):
FvList.append(component.CompLoc.upper())
-
+
return FvList
## GetBaseAddressArg() method
@@ -173,13 +173,13 @@ class Vtf (VtfClassObject):
'-s', '0x%x' % Size,
)
return CmdStr
-
+
## GenOutputArg() method
#
# Get output arguments for GenVtf
#
# @param self The object pointer
- #
+ #
def GenOutputArg(self):
FvVtfDict = {}
OutputFileName = ''
@@ -192,6 +192,6 @@ class Vtf (VtfClassObject):
OutputFileName = os.path.join(GenFdsGlobalVariable.FvDir, OutputFileName)
Arg += ('-o', OutputFileName)
FvVtfDict[FvObj.upper()] = OutputFileName
-
+
return Arg, FvVtfDict
-
+
diff --git a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
index ebd6a306390b..bf01de35a6e6 100644
--- a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
+++ b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
@@ -1,7 +1,7 @@
## @file
# Generate PCD table for 'Patchable In Module' type PCD with given .map file.
# The Patch PCD table like:
-#
+#
# PCD Name Offset in binary
# ======== ================
#
@@ -39,9 +39,9 @@ __copyright__ = "Copyright (c) 2008 - 2010, Intel Corporation. All rights reserv
symRe = re.compile('^([\da-fA-F]+):([\da-fA-F]+) +([\.\-:\\\\\w\?@\$<>]+) +([\da-fA-F]+)', re.UNICODE)
def parsePcdInfoFromMapFile(mapfilepath, efifilepath):
- """ Parse map file to get binary patch pcd information
+ """ Parse map file to get binary patch pcd information
@param path Map file absolution path
-
+
@return a list which element hold (PcdName, Offset, SectionName)
"""
lines = []
@@ -51,7 +51,7 @@ def parsePcdInfoFromMapFile(mapfilepath, efifilepath):
f.close()
except:
return None
-
+
if len(lines) == 0: return None
firstline = lines[0].strip()
if (firstline.startswith("Archive member included ") and
@@ -110,7 +110,7 @@ def _parseForGCC(lines, efifilepath):
m = pcdPatternGcc.match(lines[index + 1].strip())
if m is not None:
bpcds.append((PcdName, int(m.groups(0)[0], 16) , int(sections[-1][1], 16), sections[-1][0]))
-
+
# get section information from efi file
efisecs = PeImageClass(efifilepath).SectionHeaderList
if efisecs is None or len(efisecs) == 0:
@@ -128,11 +128,11 @@ def _parseForGCC(lines, efifilepath):
#assert efisec[0].strip() == pcd[3].strip() and efisec[1] + redirection == pcd[2], "There are some differences between map file and efi file"
pcds.append([pcd[0], efisec[2] + pcd[1] - efisec[1] - redirection, efisec[0]])
return pcds
-
+
def _parseGeneral(lines, efifilepath):
- """ For MSFT, ICC, EBC
+ """ For MSFT, ICC, EBC
@param lines line array for map file
-
+
@return a list which element hold (PcdName, Offset, SectionName)
"""
status = 0 #0 - beginning of file; 1 - PE section definition; 2 - symbol table
@@ -176,7 +176,7 @@ def _parseGeneral(lines, efifilepath):
efisecs = PeImageClass(efifilepath).SectionHeaderList
if efisecs is None or len(efisecs) == 0:
return None
-
+
pcds = []
for pcd in bPcds:
index = 0
@@ -187,7 +187,7 @@ def _parseGeneral(lines, efifilepath):
elif pcd[4] == index:
pcds.append([pcd[0], efisec[2] + pcd[2], efisec[0]])
return pcds
-
+
def generatePcdTable(list, pcdpath):
try:
f = open(pcdpath, 'w')
@@ -195,12 +195,12 @@ def generatePcdTable(list, pcdpath):
pass
f.write('PCD Name Offset Section Name\r\n')
-
+
for pcditem in list:
f.write('%-30s 0x%-08X %-6s\r\n' % (pcditem[0], pcditem[1], pcditem[2]))
f.close()
- #print 'Success to generate Binary Patch PCD table at %s!' % pcdpath
+ #print 'Success to generate Binary Patch PCD table at %s!' % pcdpath
if __name__ == '__main__':
UsageString = "%prog -m <MapFile> -e <EfiFile> -o <OutFile>"
@@ -212,7 +212,7 @@ if __name__ == '__main__':
help='Absolute path of EFI binary file.')
parser.add_option('-o', '--outputfile', action='store', dest='outfile',
help='Absolute path of output file to store the got patchable PCD table.')
-
+
(options, args) = parser.parse_args()
if options.mapfile is None or options.efifile is None:
diff --git a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
index cf2fc7c4f70a..1feec1313f95 100644
--- a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
+++ b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
@@ -35,9 +35,9 @@ __copyright__ = "Copyright (c) 2010, Intel Corporation. All rights reserved."
## PatchBinaryFile method
#
# This method mainly patches the data into binary file.
-#
+#
# @param FileName File path of the binary file
-# @param ValueOffset Offset value
+# @param ValueOffset Offset value
# @param TypeName DataType Name
# @param Value Value String
# @param MaxSize MaxSize value
@@ -173,7 +173,7 @@ def PatchBinaryFile(FileName, ValueOffset, TypeName, ValueString, MaxSize=0):
return PARAMETER_INVALID, "PCD Value %s is not valid dec or hex string array." % (ValueString)
else:
#
- # Patch ascii string
+ # Patch ascii string
#
Index = 0
for ByteString in ValueString[1:-1]:
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
index 9711de8f5c2e..6e28b351488d 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
@@ -1,11 +1,11 @@
## @file
-# This tool can be used to generate new RSA 2048 bit private/public key pairs
-# in a PEM file format using OpenSSL command line utilities that are installed
+# This tool can be used to generate new RSA 2048 bit private/public key pairs
+# in a PEM file format using OpenSSL command line utilities that are installed
# on the path specified by the system environment variable OPENSSL_PATH.
-# This tool can also optionally write one or more SHA 256 hashes of 2048 bit
-# public keys to a binary file, write one or more SHA 256 hashes of 2048 bit
-# public keys to a file in a C structure format, and in verbose mode display
-# one or more SHA 256 hashes of 2048 bit public keys in a C structure format
+# This tool can also optionally write one or more SHA 256 hashes of 2048 bit
+# public keys to a binary file, write one or more SHA 256 hashes of 2048 bit
+# public keys to a file in a C structure format, and in verbose mode display
+# one or more SHA 256 hashes of 2048 bit public keys in a C structure format
# on STDOUT.
# This tool has been tested with OpenSSL 1.0.1e 11 Feb 2013
#
@@ -25,7 +25,7 @@ Rsa2048Sha256GenerateKeys
import os
import sys
-import argparse
+import argparse
import subprocess
from Common.BuildVersion import gBUILD_VERSION
@@ -41,7 +41,7 @@ __usage__ = '%s [options]' % (__prog__)
if __name__ == '__main__':
#
# Create command line argument parser object
- #
+ #
parser = argparse.ArgumentParser(prog=__prog__, version=__version__, usage=__usage__, description=__copyright__, conflict_handler='resolve')
group = parser.add_mutually_exclusive_group(required=True)
group.add_argument("-o", "--output", dest='OutputFile', type=argparse.FileType('wb'), metavar='filename', nargs='*', help="specify the output private key filename in PEM format")
@@ -54,7 +54,7 @@ if __name__ == '__main__':
#
# Parse command line arguments
- #
+ #
args = parser.parse_args()
#
@@ -74,18 +74,18 @@ if __name__ == '__main__':
#
try:
Process = subprocess.Popen('%s version' % (OpenSslCommand), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
- except:
+ except:
print 'ERROR: Open SSL command not available. Please verify PATH or set OPENSSL_PATH'
sys.exit(1)
-
+
Version = Process.communicate()
if Process.returncode <> 0:
print 'ERROR: Open SSL command not available. Please verify PATH or set OPENSSL_PATH'
sys.exit(Process.returncode)
print Version[0]
-
+
args.PemFileName = []
-
+
#
# Check for output file argument
#
@@ -105,7 +105,7 @@ if __name__ == '__main__':
if Process.returncode <> 0:
print 'ERROR: RSA 2048 key generation failed'
sys.exit(Process.returncode)
-
+
#
# Check for input file argument
#
@@ -157,7 +157,7 @@ if __name__ == '__main__':
for Item in PublicKeyHash:
PublicKeyHashC = PublicKeyHashC + '0x%02x, ' % (ord(Item))
PublicKeyHashC = PublicKeyHashC[:-2] + '}'
-
+
#
# Write SHA 256 of 2048 bit binary public key to public key hash C structure file
#
@@ -166,9 +166,9 @@ if __name__ == '__main__':
args.PublicKeyHashCFile.close ()
except:
pass
-
+
#
# If verbose is enabled display the public key in C structure format
#
if args.Verbose:
- print 'PublicKeySha256 = ' + PublicKeyHashC
+ print 'PublicKeySha256 = ' + PublicKeyHashC
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
index d36a14ffb775..1db9bb9f9705 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
@@ -20,7 +20,7 @@ Rsa2048Sha256Sign
import os
import sys
-import argparse
+import argparse
import subprocess
import uuid
import struct
@@ -60,7 +60,7 @@ TEST_SIGNING_PRIVATE_KEY_FILENAME = 'TestSigningPrivateKey.pem'
if __name__ == '__main__':
#
# Create command line argument parser object
- #
+ #
parser = argparse.ArgumentParser(prog=__prog__, version=__version__, usage=__usage__, description=__copyright__, conflict_handler='resolve')
group = parser.add_mutually_exclusive_group(required=True)
group.add_argument("-e", action="store_true", dest='Encode', help='encode file')
@@ -75,7 +75,7 @@ if __name__ == '__main__':
#
# Parse command line arguments
- #
+ #
args = parser.parse_args()
#
@@ -95,19 +95,19 @@ if __name__ == '__main__':
#
try:
Process = subprocess.Popen('%s version' % (OpenSslCommand), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
- except:
+ except:
print 'ERROR: Open SSL command not available. Please verify PATH or set OPENSSL_PATH'
sys.exit(1)
-
+
Version = Process.communicate()
if Process.returncode <> 0:
print 'ERROR: Open SSL command not available. Please verify PATH or set OPENSSL_PATH'
sys.exit(Process.returncode)
print Version[0]
-
+
#
# Read input file into a buffer and save input filename
- #
+ #
args.InputFileName = args.InputFile.name
args.InputFileBuffer = args.InputFile.read()
args.InputFile.close()
@@ -173,17 +173,17 @@ if __name__ == '__main__':
if args.MonotonicCountStr:
format = "%dsQ" % len(args.InputFileBuffer)
FullInputFileBuffer = struct.pack(format, args.InputFileBuffer, args.MonotonicCountValue)
- #
+ #
# Sign the input file using the specified private key and capture signature from STDOUT
#
Process = subprocess.Popen('%s dgst -sha256 -sign "%s"' % (OpenSslCommand, args.PrivateKeyFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
Signature = Process.communicate(input=FullInputFileBuffer)[0]
if Process.returncode <> 0:
sys.exit(Process.returncode)
-
+
#
# Write output file that contains hash GUID, Public Key, Signature, and Input data
- #
+ #
args.OutputFile = open(args.OutputFileName, 'wb')
args.OutputFile.write(EFI_HASH_ALGORITHM_SHA256_GUID.get_bytes_le())
args.OutputFile.write(PublicKey)
@@ -197,7 +197,7 @@ if __name__ == '__main__':
#
Header = EFI_CERT_BLOCK_RSA_2048_SHA256._make(EFI_CERT_BLOCK_RSA_2048_SHA256_STRUCT.unpack_from(args.InputFileBuffer))
args.InputFileBuffer = args.InputFileBuffer[EFI_CERT_BLOCK_RSA_2048_SHA256_STRUCT.size:]
-
+
#
# Verify that the Hash Type matches the expected SHA256 type
#
@@ -221,10 +221,10 @@ if __name__ == '__main__':
# Write Signature to output file
#
open(args.OutputFileName, 'wb').write(Header.Signature)
-
+
#
# Verify signature
- #
+ #
Process = subprocess.Popen('%s dgst -sha256 -prverify "%s" -signature %s' % (OpenSslCommand, args.PrivateKeyFileName, args.OutputFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
Process.communicate(input=FullInputFileBuffer)
if Process.returncode <> 0:
@@ -233,6 +233,6 @@ if __name__ == '__main__':
sys.exit(Process.returncode)
#
- # Save output file contents from input file
- #
+ # Save output file contents from input file
+ #
open(args.OutputFileName, 'wb').write(args.InputFileBuffer)
diff --git a/BaseTools/Source/Python/Table/Table.py b/BaseTools/Source/Python/Table/Table.py
index c311df91c2ec..846f76718220 100644
--- a/BaseTools/Source/Python/Table/Table.py
+++ b/BaseTools/Source/Python/Table/Table.py
@@ -19,7 +19,7 @@ import Common.EdkLogger as EdkLogger
## TableFile
#
# This class defined a common table
-#
+#
# @param object: Inherited from object class
#
# @param Cursor: Cursor of the database
@@ -30,7 +30,7 @@ class Table(object):
self.Cur = Cursor
self.Table = ''
self.ID = 0
-
+
## Create table
#
# Create a table
@@ -46,18 +46,18 @@ class Table(object):
#
def Insert(self, SqlCommand):
self.Exec(SqlCommand)
-
+
## Query table
#
# Query all records of the table
- #
+ #
def Query(self):
EdkLogger.verbose("\nQuery tabel %s started ..." % self.Table)
SqlCommand = """select * from %s""" % self.Table
self.Cur.execute(SqlCommand)
for Rs in self.Cur:
EdkLogger.verbose(str(Rs))
-
+
TotalCount = self.GetCount()
EdkLogger.verbose("*** Total %s records in table %s ***" % (TotalCount, self.Table) )
EdkLogger.verbose("Query tabel %s DONE!" % self.Table)
@@ -70,7 +70,7 @@ class Table(object):
SqlCommand = """drop table IF EXISTS %s""" % self.Table
self.Cur.execute(SqlCommand)
EdkLogger.verbose("Drop tabel %s ... DONE!" % self.Table)
-
+
## Get count
#
# Get a count of all records of the table
@@ -82,12 +82,12 @@ class Table(object):
self.Cur.execute(SqlCommand)
for Item in self.Cur:
return Item[0]
-
+
## Generate ID
#
# Generate an ID if input ID is -1
#
- # @param ID: Input ID
+ # @param ID: Input ID
#
# @retval ID: New generated ID
#
@@ -96,14 +96,14 @@ class Table(object):
self.ID = self.ID + 1
return self.ID
-
+
## Init the ID of the table
#
# Init the ID of the table
#
def InitID(self):
self.ID = self.GetCount()
-
+
## Exec
#
# Exec Sql Command, return result
diff --git a/BaseTools/Source/Python/Table/TableDataModel.py b/BaseTools/Source/Python/Table/TableDataModel.py
index 9c3d7bd9345f..b939bc217518 100644
--- a/BaseTools/Source/Python/Table/TableDataModel.py
+++ b/BaseTools/Source/Python/Table/TableDataModel.py
@@ -22,7 +22,7 @@ from Common.String import ConvertToSqlString
## TableDataModel
#
# This class defined a table used for data model
-#
+#
# @param object: Inherited from object class
#
#
@@ -30,7 +30,7 @@ class TableDataModel(Table):
def __init__(self, Cursor):
Table.__init__(self, Cursor)
self.Table = 'DataModel'
-
+
## Create table
#
# Create table DataModel
@@ -62,13 +62,13 @@ class TableDataModel(Table):
(Name, Description) = ConvertToSqlString((Name, Description))
SqlCommand = """insert into %s values(%s, %s, '%s', '%s')""" % (self.Table, self.ID, CrossIndex, Name, Description)
Table.Insert(self, SqlCommand)
-
+
return self.ID
-
+
## Init table
#
# Create all default records of table DataModel
- #
+ #
def InitTable(self):
EdkLogger.verbose("\nInitialize table DataModel started ...")
for Item in DataClass.MODEL_LIST:
@@ -77,7 +77,7 @@ class TableDataModel(Table):
Description = Item[0]
self.Insert(CrossIndex, Name, Description)
EdkLogger.verbose("Initialize table DataModel ... DONE!")
-
+
## Get CrossIndex
#
# Get a model's cross index from its name
@@ -91,5 +91,5 @@ class TableDataModel(Table):
self.Cur.execute(SqlCommand)
for Item in self.Cur:
CrossIndex = Item[0]
-
+
return CrossIndex
diff --git a/BaseTools/Source/Python/Table/TableDec.py b/BaseTools/Source/Python/Table/TableDec.py
index 6b7d22c9384c..9daa5a008e6c 100644
--- a/BaseTools/Source/Python/Table/TableDec.py
+++ b/BaseTools/Source/Python/Table/TableDec.py
@@ -22,7 +22,7 @@ from Common.String import ConvertToSqlString
## TableDec
#
# This class defined a table used for data model
-#
+#
# @param object: Inherited from object class
#
#
@@ -30,7 +30,7 @@ class TableDec(Table):
def __init__(self, Cursor):
Table.__init__(self, Cursor)
self.Table = 'Dec'
-
+
## Create table
#
# Create table Dec
@@ -90,14 +90,14 @@ class TableDec(Table):
SqlCommand = """insert into %s values(%s, %s, '%s', '%s', '%s', '%s', %s, %s, %s, %s, %s, %s, %s)""" \
% (self.Table, self.ID, Model, Value1, Value2, Value3, Arch, BelongsToItem, BelongsToFile, StartLine, StartColumn, EndLine, EndColumn, Enabled)
Table.Insert(self, SqlCommand)
-
+
return self.ID
-
+
## Query table
#
- # @param Model: The Model of Record
+ # @param Model: The Model of Record
#
- # @retval: A recordSet of all found records
+ # @retval: A recordSet of all found records
#
def Query(self, Model):
SqlCommand = """select ID, Value1, Value2, Value3, Arch, BelongsToItem, BelongsToFile, StartLine from %s
diff --git a/BaseTools/Source/Python/Table/TableDsc.py b/BaseTools/Source/Python/Table/TableDsc.py
index 69477d544d8e..10d384dc39fa 100644
--- a/BaseTools/Source/Python/Table/TableDsc.py
+++ b/BaseTools/Source/Python/Table/TableDsc.py
@@ -22,7 +22,7 @@ from Common.String import ConvertToSqlString
## TableDsc
#
# This class defined a table used for data model
-#
+#
# @param object: Inherited from object class
#
#
@@ -30,7 +30,7 @@ class TableDsc(Table):
def __init__(self, Cursor):
Table.__init__(self, Cursor)
self.Table = 'Dsc'
-
+
## Create table
#
# Create table Dsc
@@ -90,14 +90,14 @@ class TableDsc(Table):
SqlCommand = """insert into %s values(%s, %s, '%s', '%s', '%s', '%s', %s, %s, %s, %s, %s, %s, %s)""" \
% (self.Table, self.ID, Model, Value1, Value2, Value3, Arch, BelongsToItem, BelongsToFile, StartLine, StartColumn, EndLine, EndColumn, Enabled)
Table.Insert(self, SqlCommand)
-
+
return self.ID
-
+
## Query table
#
- # @param Model: The Model of Record
+ # @param Model: The Model of Record
#
- # @retval: A recordSet of all found records
+ # @retval: A recordSet of all found records
#
def Query(self, Model):
SqlCommand = """select ID, Value1, Value2, Value3, Arch, BelongsToItem, BelongsToFile, StartLine from %s
diff --git a/BaseTools/Source/Python/Table/TableEotReport.py b/BaseTools/Source/Python/Table/TableEotReport.py
index 740105c8f99d..35ac291eb443 100644
--- a/BaseTools/Source/Python/Table/TableEotReport.py
+++ b/BaseTools/Source/Python/Table/TableEotReport.py
@@ -24,7 +24,7 @@ import Eot.EotGlobalData as EotGlobalData
## TableReport
#
# This class defined a table used for data model
-#
+#
# @param object: Inherited from object class
#
#
@@ -32,7 +32,7 @@ class TableEotReport(Table):
def __init__(self, Cursor):
Table.__init__(self, Cursor)
self.Table = 'Report'
-
+
## Create table
#
# Create table report
@@ -68,7 +68,7 @@ class TableEotReport(Table):
% (self.Table, self.ID, ModuleID, ModuleName, ModuleGuid, SourceFileID, SourceFileFullPath, \
ItemName, ItemType, ItemMode, GuidName, GuidMacro, GuidValue, BelongsToFunction, Enabled)
Table.Insert(self, SqlCommand)
-
+
def GetMaxID(self):
SqlCommand = """select max(ID) from %s""" % self.Table
self.Cur.execute(SqlCommand)
diff --git a/BaseTools/Source/Python/Table/TableFdf.py b/BaseTools/Source/Python/Table/TableFdf.py
index 927b5d1a3be6..2ee836e93b0a 100644
--- a/BaseTools/Source/Python/Table/TableFdf.py
+++ b/BaseTools/Source/Python/Table/TableFdf.py
@@ -22,7 +22,7 @@ from Common.String import ConvertToSqlString
## TableFdf
#
# This class defined a table used for data model
-#
+#
# @param object: Inherited from object class
#
#
@@ -30,7 +30,7 @@ class TableFdf(Table):
def __init__(self, Cursor):
Table.__init__(self, Cursor)
self.Table = 'Fdf'
-
+
## Create table
#
# Create table Fdf
@@ -91,14 +91,14 @@ class TableFdf(Table):
SqlCommand = """insert into %s values(%s, %s, '%s', '%s', '%s', '%s', '%s', %s, %s, %s, %s, %s, %s, %s)""" \
% (self.Table, self.ID, Model, Value1, Value2, Value3, Scope1, Scope2, BelongsToItem, BelongsToFile, StartLine, StartColumn, EndLine, EndColumn, Enabled)
Table.Insert(self, SqlCommand)
-
+
return self.ID
-
+
## Query table
#
- # @param Model: The Model of Record
+ # @param Model: The Model of Record
#
- # @retval: A recordSet of all found records
+ # @retval: A recordSet of all found records
#
def Query(self, Model):
SqlCommand = """select ID, Value1, Value2, Value3, Scope1, Scope2, BelongsToItem, BelongsToFile, StartLine from %s
diff --git a/BaseTools/Source/Python/Table/TableFile.py b/BaseTools/Source/Python/Table/TableFile.py
index caf749e9d3c5..723b19b69d81 100644
--- a/BaseTools/Source/Python/Table/TableFile.py
+++ b/BaseTools/Source/Python/Table/TableFile.py
@@ -23,14 +23,14 @@ from CommonDataClass.DataClass import FileClass
## TableFile
#
# This class defined a table used for file
-#
+#
# @param object: Inherited from object class
#
class TableFile(Table):
def __init__(self, Cursor):
Table.__init__(self, Cursor)
self.Table = 'File'
-
+
## Create table
#
# Create table File
@@ -72,15 +72,15 @@ class TableFile(Table):
SqlCommand = """insert into %s values(%s, '%s', '%s', '%s', '%s', %s, '%s')""" \
% (self.Table, self.ID, Name, ExtName, Path, FullPath, Model, TimeStamp)
Table.Insert(self, SqlCommand)
-
+
return self.ID
## InsertFile
#
# Insert one file to table
#
# @param FileFullPath: The full path of the file
- # @param Model: The model of the file
- #
+ # @param Model: The model of the file
+ #
# @retval FileID: The ID after record is inserted
#
def InsertFile(self, FileFullPath, Model):
@@ -89,7 +89,7 @@ class TableFile(Table):
TimeStamp = os.stat(FileFullPath)[8]
File = FileClass(-1, Name, Ext, Filepath, FileFullPath, Model, '', [], [], [])
return self.Insert(File.Name, File.ExtName, File.Path, File.FullPath, File.Model, TimeStamp)
-
+
## Get ID of a given file
#
# @param FilePath Path of file
diff --git a/BaseTools/Source/Python/Table/TableFunction.py b/BaseTools/Source/Python/Table/TableFunction.py
index 3d7c2d0ea5a0..af483ebd8c12 100644
--- a/BaseTools/Source/Python/Table/TableFunction.py
+++ b/BaseTools/Source/Python/Table/TableFunction.py
@@ -21,21 +21,21 @@ from Common.String import ConvertToSqlString
## TableFunction
#
# This class defined a table used for function
-#
+#
# @param Table: Inherited from Table class
#
class TableFunction(Table):
def __init__(self, Cursor):
Table.__init__(self, Cursor)
self.Table = 'Function'
-
+
## Create table
#
# Create table Function
#
# @param ID: ID of a Function
# @param Header: Header of a Function
- # @param Modifier: Modifier of a Function
+ # @param Modifier: Modifier of a Function
# @param Name: Name of a Function
# @param ReturnStatement: ReturnStatement of a Funciont
# @param StartLine: StartLine of a Function
@@ -72,7 +72,7 @@ class TableFunction(Table):
#
# @param ID: ID of a Function
# @param Header: Header of a Function
- # @param Modifier: Modifier of a Function
+ # @param Modifier: Modifier of a Function
# @param Name: Name of a Function
# @param ReturnStatement: ReturnStatement of a Funciont
# @param StartLine: StartLine of a Function
diff --git a/BaseTools/Source/Python/Table/TableIdentifier.py b/BaseTools/Source/Python/Table/TableIdentifier.py
index bcd6d6e1c152..d90035c6df82 100644
--- a/BaseTools/Source/Python/Table/TableIdentifier.py
+++ b/BaseTools/Source/Python/Table/TableIdentifier.py
@@ -21,7 +21,7 @@ from Table import Table
## TableIdentifier
#
# This class defined a table used for Identifier
-#
+#
# @param object: Inherited from object class
#
#
@@ -29,7 +29,7 @@ class TableIdentifier(Table):
def __init__(self, Cursor):
Table.__init__(self, Cursor)
self.Table = 'Identifier'
-
+
## Create table
#
# Create table Identifier
diff --git a/BaseTools/Source/Python/Table/TableInf.py b/BaseTools/Source/Python/Table/TableInf.py
index b6e300b150c1..1480af39d718 100644
--- a/BaseTools/Source/Python/Table/TableInf.py
+++ b/BaseTools/Source/Python/Table/TableInf.py
@@ -22,7 +22,7 @@ from Common.String import ConvertToSqlString
## TableInf
#
# This class defined a table used for data model
-#
+#
# @param object: Inherited from object class
#
#
@@ -30,7 +30,7 @@ class TableInf(Table):
def __init__(self, Cursor):
Table.__init__(self, Cursor)
self.Table = 'Inf'
-
+
## Create table
#
# Create table Inf
@@ -96,14 +96,14 @@ class TableInf(Table):
SqlCommand = """insert into %s values(%s, %s, '%s', '%s', '%s', '%s', '%s', '%s', %s, %s, %s, %s, %s, %s, %s)""" \
% (self.Table, self.ID, Model, Value1, Value2, Value3, Value4, Value5, Arch, BelongsToItem, BelongsToFile, StartLine, StartColumn, EndLine, EndColumn, Enabled)
Table.Insert(self, SqlCommand)
-
+
return self.ID
-
+
## Query table
#
- # @param Model: The Model of Record
+ # @param Model: The Model of Record
#
- # @retval: A recordSet of all found records
+ # @retval: A recordSet of all found records
#
def Query(self, Model):
SqlCommand = """select ID, Value1, Value2, Value3, Arch, BelongsToItem, BelongsToFile, StartLine from %s
diff --git a/BaseTools/Source/Python/Table/TablePcd.py b/BaseTools/Source/Python/Table/TablePcd.py
index 19623f98f42c..7ea521517199 100644
--- a/BaseTools/Source/Python/Table/TablePcd.py
+++ b/BaseTools/Source/Python/Table/TablePcd.py
@@ -21,7 +21,7 @@ from Common.String import ConvertToSqlString
## TablePcd
#
# This class defined a table used for pcds
-#
+#
# @param object: Inherited from object class
#
#
@@ -29,7 +29,7 @@ class TablePcd(Table):
def __init__(self, Cursor):
Table.__init__(self, Cursor)
self.Table = 'Pcd'
-
+
## Create table
#
# Create table Pcd
diff --git a/BaseTools/Source/Python/Table/TableReport.py b/BaseTools/Source/Python/Table/TableReport.py
index 4af0e98d86b4..aec7c7203c44 100644
--- a/BaseTools/Source/Python/Table/TableReport.py
+++ b/BaseTools/Source/Python/Table/TableReport.py
@@ -25,7 +25,7 @@ from Common.LongFilePathSupport import OpenLongFilePath as open
## TableReport
#
# This class defined a table used for data model
-#
+#
# @param object: Inherited from object class
#
#
@@ -33,7 +33,7 @@ class TableReport(Table):
def __init__(self, Cursor):
Table.__init__(self, Cursor)
self.Table = 'Report'
-
+
## Create table
#
# Create table report
@@ -78,7 +78,7 @@ class TableReport(Table):
## Query table
#
- # @retval: A recordSet of all found records
+ # @retval: A recordSet of all found records
#
def Query(self):
SqlCommand = """select ID, ErrorID, OtherMsg, BelongsToTable, BelongsToItem, Corrected from %s
diff --git a/BaseTools/Source/Python/TargetTool/TargetTool.py b/BaseTools/Source/Python/TargetTool/TargetTool.py
index ecac316b7a3a..113ef11f91ba 100644
--- a/BaseTools/Source/Python/TargetTool/TargetTool.py
+++ b/BaseTools/Source/Python/TargetTool/TargetTool.py
@@ -85,13 +85,13 @@ class TargetTool():
if type(self.TargetTxtDictionary[Key]) == type([]):
print "%-30s = %s" % (Key, ''.join(elem + ' ' for elem in self.TargetTxtDictionary[Key]))
elif self.TargetTxtDictionary[Key] is None:
- errMsg += " Missing %s configuration information, please use TargetTool to set value!" % Key + os.linesep
+ errMsg += " Missing %s configuration information, please use TargetTool to set value!" % Key + os.linesep
else:
print "%-30s = %s" % (Key, self.TargetTxtDictionary[Key])
-
+
if errMsg != '':
print os.linesep + 'Warning:' + os.linesep + errMsg
-
+
def RWFile(self, CommentCharacter, KeySplitCharacter, Num):
try:
fr = open(self.FileName, 'r')
@@ -110,7 +110,7 @@ class TargetTool():
existKeys.append(Key)
else:
print "Warning: Found duplicate key item in original configuration files!"
-
+
if Num == 0:
Line = "%-30s = \n" % Key
else:
@@ -125,12 +125,12 @@ class TargetTool():
if Line is None:
Line = "%-30s = " % key
fw.write(Line)
-
+
fr.close()
fw.close()
os.remove(self.FileName)
os.rename(os.path.normpath(os.path.join(self.WorkSpace, 'Conf\\targetnew.txt')), self.FileName)
-
+
except:
last_type, last_value, last_tb = sys.exc_info()
traceback.print_exception(last_type, last_value, last_tb)
@@ -142,20 +142,20 @@ def GetConfigureKeyValue(self, Key):
if os.path.exists(dscFullPath):
Line = "%-30s = %s\n" % (Key, self.Opt.DSCFILE)
else:
- EdkLogger.error("TagetTool", BuildToolError.FILE_NOT_FOUND,
+ EdkLogger.error("TagetTool", BuildToolError.FILE_NOT_FOUND,
"DSC file %s does not exist!" % self.Opt.DSCFILE, RaiseError=False)
elif Key == TAB_TAT_DEFINES_TOOL_CHAIN_CONF and self.Opt.TOOL_DEFINITION_FILE is not None:
tooldefFullPath = os.path.join(self.WorkSpace, self.Opt.TOOL_DEFINITION_FILE)
if os.path.exists(tooldefFullPath):
Line = "%-30s = %s\n" % (Key, self.Opt.TOOL_DEFINITION_FILE)
else:
- EdkLogger.error("TagetTool", BuildToolError.FILE_NOT_FOUND,
+ EdkLogger.error("TagetTool", BuildToolError.FILE_NOT_FOUND,
"Tooldef file %s does not exist!" % self.Opt.TOOL_DEFINITION_FILE, RaiseError=False)
elif self.Opt.NUM >= 2:
Line = "%-30s = %s\n" % (Key, 'Enable')
elif self.Opt.NUM <= 1:
- Line = "%-30s = %s\n" % (Key, 'Disable')
+ Line = "%-30s = %s\n" % (Key, 'Disable')
elif Key == TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER and self.Opt.NUM is not None:
Line = "%-30s = %s\n" % (Key, str(self.Opt.NUM))
elif Key == TAB_TAT_DEFINES_TARGET and self.Opt.TARGET is not None:
@@ -169,7 +169,7 @@ def GetConfigureKeyValue(self, Key):
if os.path.exists(buildruleFullPath):
Line = "%-30s = %s\n" % (Key, self.Opt.BUILD_RULE_FILE)
else:
- EdkLogger.error("TagetTool", BuildToolError.FILE_NOT_FOUND,
+ EdkLogger.error("TagetTool", BuildToolError.FILE_NOT_FOUND,
"Build rule file %s does not exist!" % self.Opt.BUILD_RULE_FILE, RaiseError=False)
return Line
@@ -199,7 +199,7 @@ def RangeCheckCallback(option, opt_str, value, parser):
setattr(parser.values, option.dest, value)
else:
parser.error("Option %s only allows one instance in command line!" % option)
-
+
def MyOptionParser():
parser = OptionParser(version=__version__,prog="TargetTool.exe",usage=__usage__,description=__copyright__)
parser.add_option("-a", "--arch", action="append", type="choice", choices=['IA32','X64','IPF','EBC', 'ARM', 'AARCH64','0'], dest="TARGET_ARCH",
@@ -225,7 +225,7 @@ if __name__ == '__main__':
if os.getenv('WORKSPACE') is None:
print "ERROR: WORKSPACE should be specified or edksetup script should be executed before run TargetTool"
sys.exit(1)
-
+
(opt, args) = MyOptionParser()
if len(args) != 1 or (args[0].lower() != 'print' and args[0].lower() != 'clean' and args[0].lower() != 'set'):
print "The number of args isn't 1 or the value of args is invalid."
diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python/Trim/Trim.py
index 3eb7fa39209d..2b3786d78cb4 100644
--- a/BaseTools/Source/Python/Trim/Trim.py
+++ b/BaseTools/Source/Python/Trim/Trim.py
@@ -258,7 +258,7 @@ def TrimPreprocessedFile(Source, Target, ConvertHex, TrimLong):
#
def TrimPreprocessedVfr(Source, Target):
CreateDirectory(os.path.dirname(Target))
-
+
try:
f = open (Source,'r')
except:
@@ -335,7 +335,7 @@ def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None):
SearchPathList = [LocalSearchPath] + IncludePathList
else:
SearchPathList = IncludePathList
-
+
for IncludePath in SearchPathList:
IncludeFile = os.path.join(IncludePath, Source)
if os.path.isfile(IncludeFile):
@@ -346,7 +346,7 @@ def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None):
except:
EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Source)
-
+
# avoid A "include" B and B "include" A
IncludeFile = os.path.abspath(os.path.normpath(IncludeFile))
if IncludeFile in gIncludedAslFile:
@@ -354,7 +354,7 @@ def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None):
ExtraData= "%s -> %s" % (" -> ".join(gIncludedAslFile), IncludeFile))
return []
gIncludedAslFile.append(IncludeFile)
-
+
for Line in F:
LocalSearchPath = None
Result = gAslIncludePattern.findall(Line)
@@ -364,7 +364,7 @@ def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None):
NewFileContent.append("%s%s" % (Indent, Line))
continue
#
- # We should first search the local directory if current file are using pattern #include "XXX"
+ # We should first search the local directory if current file are using pattern #include "XXX"
#
if Result[0][2] == '"':
LocalSearchPath = os.path.dirname(IncludeFile)
@@ -385,20 +385,20 @@ def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None):
#
# @param Source File to be trimmed
# @param Target File to store the trimmed content
-# @param IncludePathFile The file to log the external include path
+# @param IncludePathFile The file to log the external include path
#
def TrimAslFile(Source, Target, IncludePathFile):
CreateDirectory(os.path.dirname(Target))
-
+
SourceDir = os.path.dirname(Source)
if SourceDir == '':
SourceDir = '.'
-
+
#
# Add source directory as the first search directory
#
IncludePathList = [SourceDir]
-
+
#
# If additional include path file is specified, append them all
# to the search directory list.
@@ -669,7 +669,7 @@ def Main():
EdkLogger.SetLevel(CommandOptions.LogLevel)
except FatalError, X:
return 1
-
+
try:
if CommandOptions.FileType == "Vfr":
if CommandOptions.OutputFile is None:
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 13a1ed886cc4..48690aa357f7 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -1059,7 +1059,7 @@ class DscBuildData(PlatformBuildClassObject):
return PcdValue
try:
PcdValue = ValueExpressionEx(PcdValue[1:], PcdDatumType, GuidDict)(True)
- except BadExpression, Value:
+ except BadExpression, Value:
EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' %
(TokenSpaceGuidCName, TokenCName, PcdValue, Value))
elif PcdValue.startswith("L'") or PcdValue.startswith("'"):
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index 550359f9abb2..2226f707c7aa 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -371,7 +371,7 @@ class MetaFileParser(object):
# Sometimes, we need to make differences between EDK and EDK2 modules
if Name == 'INF_VERSION':
if hexVersionPattern.match(Value):
- self._Version = int(Value, 0)
+ self._Version = int(Value, 0)
elif decVersionPattern.match(Value):
ValueList = Value.split('.')
Major = '%04o' % int(ValueList[0], 0)
diff --git a/BaseTools/Source/Python/Workspace/MetaFileTable.py b/BaseTools/Source/Python/Workspace/MetaFileTable.py
index 3c8dae0e622f..5aa5d67b8239 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileTable.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileTable.py
@@ -56,7 +56,7 @@ class MetaFileTable(Table):
Result = self.Cur.execute("select ID from %s where ID<0" % (self.Table)).fetchall()
if not Result:
# update the timestamp in database
- self._FileIndexTable.SetFileTimeStamp(self.IdBase, TimeStamp)
+ self._FileIndexTable.SetFileTimeStamp(self.IdBase, TimeStamp)
return False
if TimeStamp != self._FileIndexTable.GetFileTimeStamp(self.IdBase):
@@ -113,28 +113,28 @@ class ModuleTable(MetaFileTable):
BelongsToItem=-1, StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1, Enabled=0):
(Value1, Value2, Value3, Scope1, Scope2) = ConvertToSqlString((Value1, Value2, Value3, Scope1, Scope2))
return Table.Insert(
- self,
- Model,
- Value1,
- Value2,
- Value3,
- Scope1,
+ self,
+ Model,
+ Value1,
+ Value2,
+ Value3,
+ Scope1,
Scope2,
- BelongsToItem,
- StartLine,
- StartColumn,
- EndLine,
- EndColumn,
+ BelongsToItem,
+ StartLine,
+ StartColumn,
+ EndLine,
+ EndColumn,
Enabled
)
## Query table
#
- # @param Model: The Model of Record
- # @param Arch: The Arch attribute of Record
- # @param Platform The Platform attribute of Record
+ # @param Model: The Model of Record
+ # @param Arch: The Arch attribute of Record
+ # @param Platform The Platform attribute of Record
#
- # @retval: A recordSet of all found records
+ # @retval: A recordSet of all found records
#
def Query(self, Model, Arch=None, Platform=None, BelongsToItem=None):
ConditionString = "Model=%s AND Enabled>=0" % Model
@@ -195,27 +195,27 @@ class PackageTable(MetaFileTable):
BelongsToItem=-1, StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1, Enabled=0):
(Value1, Value2, Value3, Scope1, Scope2) = ConvertToSqlString((Value1, Value2, Value3, Scope1, Scope2))
return Table.Insert(
- self,
- Model,
- Value1,
- Value2,
- Value3,
- Scope1,
+ self,
+ Model,
+ Value1,
+ Value2,
+ Value3,
+ Scope1,
Scope2,
- BelongsToItem,
- StartLine,
- StartColumn,
- EndLine,
- EndColumn,
+ BelongsToItem,
+ StartLine,
+ StartColumn,
+ EndLine,
+ EndColumn,
Enabled
)
## Query table
#
- # @param Model: The Model of Record
- # @param Arch: The Arch attribute of Record
+ # @param Model: The Model of Record
+ # @param Arch: The Arch attribute of Record
#
- # @retval: A recordSet of all found records
+ # @retval: A recordSet of all found records
#
def Query(self, Model, Arch=None):
ConditionString = "Model=%s AND Enabled>=0" % Model
@@ -236,7 +236,7 @@ class PackageTable(MetaFileTable):
try:
for row in self.Cur:
comment = row[0]
-
+
LineNum = row[1]
comment = comment.strip("#")
comment = comment.strip()
@@ -310,32 +310,32 @@ class PlatformTable(MetaFileTable):
FromItem=-1, StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1, Enabled=1):
(Value1, Value2, Value3, Scope1, Scope2,Scope3) = ConvertToSqlString((Value1, Value2, Value3, Scope1, Scope2,Scope3))
return Table.Insert(
- self,
- Model,
- Value1,
- Value2,
- Value3,
- Scope1,
+ self,
+ Model,
+ Value1,
+ Value2,
+ Value3,
+ Scope1,
Scope2,
Scope3,
- BelongsToItem,
+ BelongsToItem,
FromItem,
- StartLine,
- StartColumn,
- EndLine,
- EndColumn,
+ StartLine,
+ StartColumn,
+ EndLine,
+ EndColumn,
Enabled
)
## Query table
#
- # @param Model: The Model of Record
+ # @param Model: The Model of Record
# @param Scope1: Arch of a Dsc item
# @param Scope2: Module type of a Dsc item
# @param BelongsToItem: The item belongs to which another item
# @param FromItem: The item belongs to which dsc file
#
- # @retval: A recordSet of all found records
+ # @retval: A recordSet of all found records
#
def Query(self, Model, Scope1=None, Scope2=None, BelongsToItem=None, FromItem=None):
ConditionString = "Model=%s AND Enabled>0" % Model
diff --git a/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py b/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py
index 14dcb1ae8136..fccd2ebfb8e8 100644
--- a/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py
+++ b/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py
@@ -114,8 +114,8 @@ class WorkspaceDatabase(object):
# get the parser ready for this file
MetaFile = self._FILE_PARSER_[FileType](
- FilePath,
- FileType,
+ FilePath,
+ FileType,
Arch,
MetaFileStorage(self.WorkspaceDb.Cur, FilePath, FileType)
)
@@ -162,7 +162,7 @@ class WorkspaceDatabase(object):
# remove db file in case inconsistency between db and file in file system
if self._CheckWhetherDbNeedRenew(RenewDb, DbPath):
os.remove(DbPath)
-
+
# create db with optimized parameters
self.Conn = sqlite3.connect(DbPath, isolation_level='DEFERRED')
self.Conn.execute("PRAGMA synchronous=OFF")
@@ -199,11 +199,11 @@ class WorkspaceDatabase(object):
def _CheckWhetherDbNeedRenew (self, force, DbPath):
# if database does not exist, we need do nothing
if not os.path.exists(DbPath): return False
-
+
# if user force to renew database, then not check whether database is out of date
if force: return True
-
- #
+
+ #
# Check the time of last modified source file or build.exe
# if is newer than time of database, then database need to be re-created.
#
@@ -217,15 +217,15 @@ class WorkspaceDatabase(object):
if rootPath == "" or rootPath is None:
EdkLogger.verbose("\nFail to find the root path of build.exe or python sources, so can not \
determine whether database file is out of date!\n")
-
+
# walk the root path of source or build's binary to get the time last modified.
-
+
for root, dirs, files in os.walk (rootPath):
for dir in dirs:
- # bypass source control folder
+ # bypass source control folder
if dir.lower() in [".svn", "_svn", "cvs"]:
dirs.remove(dir)
-
+
for file in files:
ext = os.path.splitext(file)[1]
if ext.lower() == ".py": # only check .py files
@@ -235,9 +235,9 @@ determine whether database file is out of date!\n")
if timeOfToolModified > os.stat(DbPath).st_mtime:
EdkLogger.verbose("\nWorkspace database is out of data!")
return True
-
+
return False
-
+
## Initialize build database
def InitDatabase(self):
EdkLogger.verbose("\nInitialize build database started ...")
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index c4647d068a6b..bcbe6f89b48b 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -235,7 +235,7 @@ def FindIncludeFiles(Source, IncludePathList, IncludeFiles):
## Split each lines in file
#
-# This method is used to split the lines in file to make the length of each line
+# This method is used to split the lines in file to make the length of each line
# less than MaxLength.
#
# @param Content The content of file
@@ -260,12 +260,12 @@ def FileLinesSplit(Content=None, MaxLength=None):
NewContentList.append(Line)
for NewLine in NewContentList:
NewContent += NewLine + TAB_LINE_BREAK
-
+
NewContent = NewContent.replace(TAB_LINE_BREAK, gEndOfLine).replace('\r\r\n', gEndOfLine)
return NewContent
-
-
-
+
+
+
##
# Parse binary dependency expression section
#
@@ -295,10 +295,10 @@ class DepexParser(object):
for Guid in Package.Guids:
GuidValue = GuidStructureStringToGuidString(Package.Guids[Guid])
self._GuidDb[GuidValue.upper()] = Guid
-
+
##
# Parse the binary dependency expression files.
- #
+ #
# This function parses the binary dependency expression file and translate it
# to the instruction list.
#
@@ -320,7 +320,7 @@ class DepexParser(object):
OpCode = DepexFile.read(1)
return DepexStatement
-
+
##
# Reports library information
#
@@ -426,7 +426,7 @@ class DepexReport(object):
if ModuleType in ["SEC", "PEI_CORE", "DXE_CORE", "SMM_CORE", "MM_CORE_STANDALONE", "UEFI_APPLICATION"]:
return
-
+
for Source in M.SourceFileList:
if os.path.splitext(Source.Path)[1].lower() == ".dxs":
Match = gDxsDependencyPattern.search(open(Source.Path).read())
@@ -472,7 +472,7 @@ class DepexReport(object):
FileWrite(File, gSubSectionSep)
except:
EdkLogger.warn(None, "Dependency expression file is corrupted", self._DepexFileName)
-
+
FileWrite(File, "Dependency Expression (DEPEX) from %s" % self.Source)
if self.Source == "INF":
@@ -977,7 +977,7 @@ class PcdReport(object):
EdkLogger.error('BuildReport', FORMAT_INVALID, "PCD Value: %s, Type: %s" %(DscDefaultValue, Pcd.DatumType))
InfDefaultValue = None
-
+
PcdValue = DecDefaultValue
if DscDefaultValue:
PcdValue = DscDefaultValue
@@ -1519,7 +1519,7 @@ class PredictionReport(object):
EotEndTime = time.time()
EotDuration = time.strftime("%H:%M:%S", time.gmtime(int(round(EotEndTime - EotStartTime))))
EdkLogger.quiet("EOT run time: %s\n" % EotDuration)
-
+
#
# Parse the output of EOT tool
#
@@ -1717,7 +1717,7 @@ class FdRegionReport(object):
PlatformPcds = {}
#
# Collect PCDs declared in DEC files.
- #
+ #
for Pa in Wa.AutoGenObjectList:
for Package in Pa.PackageList:
for (TokenCName, TokenSpaceGuidCName, DecType) in Package.Pcds:
@@ -1998,7 +1998,7 @@ class PlatformReport(object):
self.DepexParser = None
if "DEPEX" in ReportType:
self.DepexParser = DepexParser(Wa)
-
+
self.ModuleReportList = []
if MaList is not None:
self._IsModuleBuild = True
@@ -2073,7 +2073,7 @@ class PlatformReport(object):
if not self._IsModuleBuild:
if "PCD" in ReportType:
self.PcdReport.GenerateReport(File, None)
-
+
if "FLASH" in ReportType:
for FdReportListItem in self.FdReportList:
FdReportListItem.GenerateReport(File)
@@ -2107,7 +2107,7 @@ class BuildReport(object):
if ReportFile:
self.ReportList = []
self.ReportType = []
- if ReportType:
+ if ReportType:
for ReportTypeItem in ReportType:
if ReportTypeItem not in self.ReportType:
self.ReportType.append(ReportTypeItem)
@@ -2153,7 +2153,7 @@ class BuildReport(object):
EdkLogger.error("BuildReport", CODE_ERROR, "Unknown fatal error when generating build report", ExtraData=self.ReportFile, RaiseError=False)
EdkLogger.quiet("(Python %s on %s\n%s)" % (platform.python_version(), sys.platform, traceback.format_exc()))
File.close()
-
+
# This acts like the main() function for the script, unless it is 'import'ed into another script.
if __name__ == '__main__':
pass
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 1c26e72feb6b..0ca78c1fa451 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -110,7 +110,7 @@ def CheckEnvVariable():
EdkLogger.error("build", FORMAT_NOT_SUPPORTED, "No space is allowed in WORKSPACE path",
ExtraData=WorkspaceDir)
os.environ["WORKSPACE"] = WorkspaceDir
-
+
# set multiple workspace
PackagesPath = os.getenv("PACKAGES_PATH")
mws.setWs(WorkspaceDir, PackagesPath)
@@ -200,7 +200,7 @@ def CheckEnvVariable():
GlobalData.gGlobalDefines["EDK_SOURCE"] = EdkSourceDir
GlobalData.gGlobalDefines["ECP_SOURCE"] = EcpSourceDir
GlobalData.gGlobalDefines["EDK_TOOLS_PATH"] = os.environ["EDK_TOOLS_PATH"]
-
+
## Get normalized file path
#
# Convert the path to be local format, and remove the WORKSPACE path at the
@@ -265,7 +265,7 @@ def LaunchCommand(Command, WorkingDir):
# if working directory doesn't exist, Popen() will raise an exception
if not os.path.isdir(WorkingDir):
EdkLogger.error("build", FILE_NOT_FOUND, ExtraData=WorkingDir)
-
+
# Command is used as the first Argument in following Popen().
# It could be a string or sequence. We find that if command is a string in following Popen(),
# ubuntu may fail with an error message that the command is not found.
@@ -848,14 +848,14 @@ class Build():
# print current build environment and configuration
EdkLogger.quiet("%-16s = %s" % ("WORKSPACE", os.environ["WORKSPACE"]))
if "PACKAGES_PATH" in os.environ:
- # WORKSPACE env has been converted before. Print the same path style with WORKSPACE env.
+ # WORKSPACE env has been converted before. Print the same path style with WORKSPACE env.
EdkLogger.quiet("%-16s = %s" % ("PACKAGES_PATH", os.path.normcase(os.path.normpath(os.environ["PACKAGES_PATH"]))))
EdkLogger.quiet("%-16s = %s" % ("ECP_SOURCE", os.environ["ECP_SOURCE"]))
EdkLogger.quiet("%-16s = %s" % ("EDK_SOURCE", os.environ["EDK_SOURCE"]))
EdkLogger.quiet("%-16s = %s" % ("EFI_SOURCE", os.environ["EFI_SOURCE"]))
EdkLogger.quiet("%-16s = %s" % ("EDK_TOOLS_PATH", os.environ["EDK_TOOLS_PATH"]))
if "EDK_TOOLS_BIN" in os.environ:
- # Print the same path style with WORKSPACE env.
+ # Print the same path style with WORKSPACE env.
EdkLogger.quiet("%-16s = %s" % ("EDK_TOOLS_BIN", os.path.normcase(os.path.normpath(os.environ["EDK_TOOLS_BIN"]))))
EdkLogger.quiet("%-16s = %s" % ("CONF_PATH", GlobalData.gConfDirectory))
self.InitPreBuild()
@@ -1961,7 +1961,7 @@ class Build():
self._SaveMapFile (MapBuffer, Wa)
def _GenFfsCmd(self):
- # convert dictionary of Cmd:(Inf,Arch)
+ # convert dictionary of Cmd:(Inf,Arch)
# to a new dictionary of (Inf,Arch):Cmd,Cmd,Cmd...
CmdSetDict = defaultdict(set)
GenFfsDict = GenFds.GenFfsMakefile('', GlobalData.gFdfParser, self, self.ArchList, GlobalData)
@@ -2033,7 +2033,7 @@ class Build():
for Module in ModuleList:
# Get ModuleAutoGen object to generate C code file and makefile
Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile)
-
+
if Ma is None:
continue
if Ma.CanSkipbyHash():
diff --git a/BaseTools/Source/Python/sitecustomize.py b/BaseTools/Source/Python/sitecustomize.py
index 4ea84c512969..3afa90700e30 100644
--- a/BaseTools/Source/Python/sitecustomize.py
+++ b/BaseTools/Source/Python/sitecustomize.py
@@ -16,6 +16,6 @@ import locale
if sys.platform == "darwin":
DefaultLocal = locale.getdefaultlocale()[1]
if DefaultLocal is None:
- DefaultLocal = 'UTF8'
+ DefaultLocal = 'UTF8'
sys.setdefaultencoding(DefaultLocal)
--
2.16.2.windows.1
* [PATCH v1 33/42] BaseTools: AutoGen - add Opcode constants
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (31 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 32/42] BaseTools: trim whitespace Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 34/42] BaseTools: standardize GUID and pack size Jaben Carsey
` (9 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Add constants for the dependency expression opcode strings, and
use the new constants in place of the bare string literals in GenDepex.py.
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/GenDepex.py | 112 ++++++++++----------
BaseTools/Source/Python/Common/DataType.py | 12 +++
2 files changed, 67 insertions(+), 57 deletions(-)
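The Common/DataType.py hunk is not reproduced below; judging from the string
literals they replace in the GenDepex.py hunk, the new constants are presumably
just the opcode strings bound to named identifiers, roughly:
    # Common/DataType.py (sketch only; names taken from the GenDepex.py hunk,
    # values inferred from the literals they replace)
    DEPEX_OPCODE_BEFORE = "BEFORE"
    DEPEX_OPCODE_AFTER  = "AFTER"
    DEPEX_OPCODE_PUSH   = "PUSH"
    DEPEX_OPCODE_AND    = "AND"
    DEPEX_OPCODE_OR     = "OR"
    DEPEX_OPCODE_NOT    = "NOT"
    DEPEX_OPCODE_END    = "END"
    DEPEX_OPCODE_SOR    = "SOR"
    DEPEX_OPCODE_TRUE   = "TRUE"
    DEPEX_OPCODE_FALSE  = "FALSE"
The values must match the original strings exactly for the generated depex
binaries to be unchanged by this refactoring.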
diff --git a/BaseTools/Source/Python/AutoGen/GenDepex.py b/BaseTools/Source/Python/AutoGen/GenDepex.py
index 9acea8f6bfed..3dcbad5be666 100644
--- a/BaseTools/Source/Python/AutoGen/GenDepex.py
+++ b/BaseTools/Source/Python/AutoGen/GenDepex.py
@@ -1,7 +1,7 @@
## @file
# This file is used to generate DEPEX file for module's dependency expression
#
-# Copyright (c) 2007 - 2014, Intel Corporation. All rights reserved.<BR>
+# Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
# This program and the accompanying materials
# are licensed and made available under the terms and conditions of the BSD License
# which accompanies this distribution. The full text of the license may be found at
@@ -24,6 +24,7 @@ from Common.Misc import SaveFileOnChange
from Common.Misc import GuidStructureStringToGuidString
from Common import EdkLogger as EdkLogger
from Common.BuildVersion import gBUILD_VERSION
+from Common.DataType import *
## Regular expression for matching "DEPENDENCY_START ... DEPENDENCY_END"
gStartClosePattern = re.compile(".*DEPENDENCY_START(.+)DEPENDENCY_END.*", re.S)
@@ -70,65 +71,62 @@ class DependencyExpression:
)
OpcodePriority = {
- "AND" : 1,
- "OR" : 1,
- "NOT" : 2,
- # "SOR" : 9,
- # "BEFORE": 9,
- # "AFTER" : 9,
+ DEPEX_OPCODE_AND : 1,
+ DEPEX_OPCODE_OR : 1,
+ DEPEX_OPCODE_NOT : 2,
}
Opcode = {
"PEI" : {
- "PUSH" : 0x02,
- "AND" : 0x03,
- "OR" : 0x04,
- "NOT" : 0x05,
- "TRUE" : 0x06,
- "FALSE" : 0x07,
- "END" : 0x08
+ DEPEX_OPCODE_PUSH : 0x02,
+ DEPEX_OPCODE_AND : 0x03,
+ DEPEX_OPCODE_OR : 0x04,
+ DEPEX_OPCODE_NOT : 0x05,
+ DEPEX_OPCODE_TRUE : 0x06,
+ DEPEX_OPCODE_FALSE : 0x07,
+ DEPEX_OPCODE_END : 0x08
},
"DXE" : {
- "BEFORE": 0x00,
- "AFTER" : 0x01,
- "PUSH" : 0x02,
- "AND" : 0x03,
- "OR" : 0x04,
- "NOT" : 0x05,
- "TRUE" : 0x06,
- "FALSE" : 0x07,
- "END" : 0x08,
- "SOR" : 0x09
+ DEPEX_OPCODE_BEFORE: 0x00,
+ DEPEX_OPCODE_AFTER : 0x01,
+ DEPEX_OPCODE_PUSH : 0x02,
+ DEPEX_OPCODE_AND : 0x03,
+ DEPEX_OPCODE_OR : 0x04,
+ DEPEX_OPCODE_NOT : 0x05,
+ DEPEX_OPCODE_TRUE : 0x06,
+ DEPEX_OPCODE_FALSE : 0x07,
+ DEPEX_OPCODE_END : 0x08,
+ DEPEX_OPCODE_SOR : 0x09
},
"MM" : {
- "BEFORE": 0x00,
- "AFTER" : 0x01,
- "PUSH" : 0x02,
- "AND" : 0x03,
- "OR" : 0x04,
- "NOT" : 0x05,
- "TRUE" : 0x06,
- "FALSE" : 0x07,
- "END" : 0x08,
- "SOR" : 0x09
+ DEPEX_OPCODE_BEFORE: 0x00,
+ DEPEX_OPCODE_AFTER : 0x01,
+ DEPEX_OPCODE_PUSH : 0x02,
+ DEPEX_OPCODE_AND : 0x03,
+ DEPEX_OPCODE_OR : 0x04,
+ DEPEX_OPCODE_NOT : 0x05,
+ DEPEX_OPCODE_TRUE : 0x06,
+ DEPEX_OPCODE_FALSE : 0x07,
+ DEPEX_OPCODE_END : 0x08,
+ DEPEX_OPCODE_SOR : 0x09
}
}
# all supported op codes and operands
- SupportedOpcode = ["BEFORE", "AFTER", "PUSH", "AND", "OR", "NOT", "END", "SOR"]
- SupportedOperand = ["TRUE", "FALSE"]
+ SupportedOpcode = [DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER, DEPEX_OPCODE_PUSH, DEPEX_OPCODE_AND, DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT, DEPEX_OPCODE_END, DEPEX_OPCODE_SOR]
+ SupportedOperand = [DEPEX_OPCODE_TRUE, DEPEX_OPCODE_FALSE]
- OpcodeWithSingleOperand = ['NOT', 'BEFORE', 'AFTER']
- OpcodeWithTwoOperand = ['AND', 'OR']
+ OpcodeWithSingleOperand = [DEPEX_OPCODE_NOT, DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER]
+ OpcodeWithTwoOperand = [DEPEX_OPCODE_AND, DEPEX_OPCODE_OR]
# op code that should not be the last one
- NonEndingOpcode = ["AND", "OR", "NOT", 'SOR']
+ NonEndingOpcode = [DEPEX_OPCODE_AND, DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT, DEPEX_OPCODE_SOR]
# op code must not present at the same time
- ExclusiveOpcode = ["BEFORE", "AFTER"]
+ ExclusiveOpcode = [DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER]
# op code that should be the first one if it presents
- AboveAllOpcode = ["SOR", "BEFORE", "AFTER"]
+ AboveAllOpcode = [DEPEX_OPCODE_SOR, DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER]
#
# open and close brace must be taken as individual tokens
@@ -200,7 +198,7 @@ class DependencyExpression:
break
self.PostfixNotation.append(Stack.pop())
elif Token in self.OpcodePriority:
- if Token == "NOT":
+ if Token == DEPEX_OPCODE_NOT:
if LastToken not in self.SupportedOpcode + ['(', '', None]:
EdkLogger.error("GenDepex", PARSER_ERROR, "Invalid dependency expression: missing operator before NOT",
ExtraData="Near %s" % LastToken)
@@ -222,10 +220,10 @@ class DependencyExpression:
ExtraData="Near %s" % LastToken)
if len(self.OpcodeList) == 0 or self.OpcodeList[-1] not in self.ExclusiveOpcode:
if Token not in self.SupportedOperand:
- self.PostfixNotation.append("PUSH")
+ self.PostfixNotation.append(DEPEX_OPCODE_PUSH)
# check if OP is valid in this phase
elif Token in self.Opcode[self.Phase]:
- if Token == "END":
+ if Token == DEPEX_OPCODE_END:
break
self.OpcodeList.append(Token)
else:
@@ -241,8 +239,8 @@ class DependencyExpression:
ExtraData=str(self))
while len(Stack) > 0:
self.PostfixNotation.append(Stack.pop())
- if self.PostfixNotation[-1] != 'END':
- self.PostfixNotation.append("END")
+ if self.PostfixNotation[-1] != DEPEX_OPCODE_END:
+ self.PostfixNotation.append(DEPEX_OPCODE_END)
## Validate the dependency expression
def ValidateOpcode(self):
@@ -262,20 +260,20 @@ class DependencyExpression:
if len(self.PostfixNotation) < 3:
EdkLogger.error("GenDepex", PARSER_ERROR, "Missing operand for %s" % Op,
ExtraData=str(self))
- if self.TokenList[-1] != 'END' and self.TokenList[-1] in self.NonEndingOpcode:
+ if self.TokenList[-1] != DEPEX_OPCODE_END and self.TokenList[-1] in self.NonEndingOpcode:
EdkLogger.error("GenDepex", PARSER_ERROR, "Extra %s at the end of the dependency expression" % self.TokenList[-1],
ExtraData=str(self))
- if self.TokenList[-1] == 'END' and self.TokenList[-2] in self.NonEndingOpcode:
+ if self.TokenList[-1] == DEPEX_OPCODE_END and self.TokenList[-2] in self.NonEndingOpcode:
EdkLogger.error("GenDepex", PARSER_ERROR, "Extra %s at the end of the dependency expression" % self.TokenList[-2],
ExtraData=str(self))
- if "END" in self.TokenList and "END" != self.TokenList[-1]:
+ if DEPEX_OPCODE_END in self.TokenList and DEPEX_OPCODE_END != self.TokenList[-1]:
EdkLogger.error("GenDepex", PARSER_ERROR, "Extra expressions after END",
ExtraData=str(self))
## Simply optimize the dependency expression by removing duplicated operands
def Optimize(self):
ValidOpcode = list(set(self.OpcodeList))
- if len(ValidOpcode) != 1 or ValidOpcode[0] not in ['AND', 'OR']:
+ if len(ValidOpcode) != 1 or ValidOpcode[0] not in [DEPEX_OPCODE_AND, DEPEX_OPCODE_OR]:
return
Op = ValidOpcode[0]
NewOperand = []
@@ -284,14 +282,14 @@ class DependencyExpression:
if Token in self.SupportedOpcode or Token in NewOperand:
continue
AllOperand.add(Token)
- if Token == 'TRUE':
- if Op == 'AND':
+ if Token == DEPEX_OPCODE_TRUE:
+ if Op == DEPEX_OPCODE_AND:
continue
else:
NewOperand.append(Token)
break
- elif Token == 'FALSE':
- if Op == 'OR':
+ elif Token == DEPEX_OPCODE_FALSE:
+ if Op == DEPEX_OPCODE_OR:
continue
else:
NewOperand.append(Token)
@@ -299,13 +297,13 @@ class DependencyExpression:
NewOperand.append(Token)
# don't generate depex if only TRUE operand left
- if self.ModuleType == 'PEIM' and len(NewOperand) == 1 and NewOperand[0] == 'TRUE':
+ if self.ModuleType == 'PEIM' and len(NewOperand) == 1 and NewOperand[0] == DEPEX_OPCODE_TRUE:
self.PostfixNotation = []
return
# don't generate depex if all operands are architecture protocols
if self.ModuleType in ['UEFI_DRIVER', 'DXE_DRIVER', 'DXE_RUNTIME_DRIVER', 'DXE_SAL_DRIVER', 'DXE_SMM_DRIVER', 'MM_STANDALONE'] and \
- Op == 'AND' and \
+ Op == DEPEX_OPCODE_AND and \
self.ArchProtocols == set([GuidStructureStringToGuidString(Guid) for Guid in AllOperand]):
self.PostfixNotation = []
return
@@ -371,7 +369,7 @@ class DependencyExpression:
versionNumber = ("0.04" + " " + gBUILD_VERSION)
__version__ = "%prog Version " + versionNumber
-__copyright__ = "Copyright (c) 2007-2010, Intel Corporation All rights reserved."
+__copyright__ = "Copyright (c) 2007-2018, Intel Corporation All rights reserved."
__usage__ = "%prog [options] [dependency_expression_file]"
## Parse command line options
diff --git a/BaseTools/Source/Python/Common/DataType.py b/BaseTools/Source/Python/Common/DataType.py
index 8af94354620c..48700ba82012 100644
--- a/BaseTools/Source/Python/Common/DataType.py
+++ b/BaseTools/Source/Python/Common/DataType.py
@@ -473,6 +473,18 @@ DATABASE_PATH = ":memory:" #"BuildDatabase.db"
# used by ECC
MODIFIER_LIST = ['IN', 'OUT', 'OPTIONAL', 'UNALIGNED', 'EFI_RUNTIMESERVICE', 'EFI_BOOTSERVICE', 'EFIAPI']
+# Dependency Opcodes
+DEPEX_OPCODE_BEFORE = "BEFORE"
+DEPEX_OPCODE_AFTER = "AFTER"
+DEPEX_OPCODE_PUSH = "PUSH"
+DEPEX_OPCODE_AND = "AND"
+DEPEX_OPCODE_OR = "OR"
+DEPEX_OPCODE_NOT = "NOT"
+DEPEX_OPCODE_END = "END"
+DEPEX_OPCODE_SOR = "SOR"
+DEPEX_OPCODE_TRUE = "TRUE"
+DEPEX_OPCODE_FALSE = "FALSE"
+
# Dependency Expression
DEPEX_SUPPORTED_OPCODE = ["BEFORE", "AFTER", "PUSH", "AND", "OR", "NOT", "END", "SOR", "TRUE", "FALSE", '(', ')']
--
2.16.2.windows.1
* [PATCH v1 34/42] BaseTools: standardize GUID and pack size
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (32 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 33/42] BaseTools: AutoGen - add Opcode constants Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 35/42] BaseTools: remove unused variable Jaben Carsey
` (8 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Currently, GUID packing and pack-size determination are spread
throughout the code. This patch introduces a shared function and dict
and routes all code paths through them.
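As a quick illustration, here is a minimal standalone sketch of the shared helpers (the names mirror the PACK_PATTERN_GUID / PACK_CODE_BY_SIZE / PackGUID additions in the diff below; the snippet is an approximation, not the full Misc.py code):
from struct import pack
# pack codes keyed by size in bytes (mirrors the dict added to DataType.py)
PACK_PATTERN_GUID = '=LHHBBBBBBBB'
PACK_CODE_BY_SIZE = {8: '=Q', 4: '=L', 2: '=H', 1: '=B', 0: '=B', 16: ''}
def PackGUID(Guid):
    # Guid is a registry-format GUID string already split on '-'
    return pack(PACK_PATTERN_GUID,
                int(Guid[0], 16), int(Guid[1], 16), int(Guid[2], 16),
                int(Guid[3][-4:-2], 16), int(Guid[3][-2:], 16),
                int(Guid[4][-12:-10], 16), int(Guid[4][-10:-8], 16),
                int(Guid[4][-8:-6], 16), int(Guid[4][-6:-4], 16),
                int(Guid[4][-4:-2], 16), int(Guid[4][-2:], 16))
# e.g. PackGUID('A67DF1FA-8DE8-4E98-AF09-4BDF2EFFBC7C'.split('-'))
# returns the 16-byte packed form used by the callers below.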
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/GenPcdDb.py | 45 ++---------------
BaseTools/Source/Python/AutoGen/GenVar.py | 25 +---------
BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py | 28 ++---------
BaseTools/Source/Python/Common/DataType.py | 11 +++++
BaseTools/Source/Python/Common/Misc.py | 51 ++++++++++++++------
BaseTools/Source/Python/GenFds/Fv.py | 22 ++-------
BaseTools/Source/Python/build/BuildReport.py | 2 +-
7 files changed, 61 insertions(+), 123 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index ef6647a15302..94f430897b98 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -21,7 +21,6 @@ from Common.VariableAttributes import VariableAttributes
import copy
from struct import unpack
from Common.DataType import *
-from GenVar import PackGUID
DATABASE_VERSION = 7
@@ -290,22 +289,7 @@ class DbItemList:
GuidString = GuidStructureStringToGuidString(GuidStructureValue)
return PackGUID(GuidString.split('-'))
- if self.ItemSize == 8:
- PackStr = "=Q"
- elif self.ItemSize == 4:
- PackStr = "=L"
- elif self.ItemSize == 2:
- PackStr = "=H"
- elif self.ItemSize == 1:
- PackStr = "=B"
- elif self.ItemSize == 0:
- PackStr = "=B"
- elif self.ItemSize == 16:
- # pack Guid
- PackStr = ''
- else:
- # should not reach here
- assert(False)
+ PackStr = PACK_CODE_BY_SIZE[self.ItemSize]
Buffer = ''
for Datas in self.RawDataList:
@@ -379,18 +363,7 @@ class DbComItemList (DbItemList):
return self.ListSize
def PackData(self):
- if self.ItemSize == 8:
- PackStr = "=Q"
- elif self.ItemSize == 4:
- PackStr = "=L"
- elif self.ItemSize == 2:
- PackStr = "=H"
- elif self.ItemSize == 1:
- PackStr = "=B"
- elif self.ItemSize == 0:
- PackStr = "=B"
- else:
- assert(False)
+ PackStr = PACK_CODE_BY_SIZE[self.ItemSize]
Buffer = ''
for DataList in self.RawDataList:
@@ -818,19 +791,7 @@ def BuildExDataBase(Dict):
# Construct the database buffer
Guid = "{0x3c7d193c, 0x682c, 0x4c14, 0xa6, 0x8f, 0x55, 0x2d, 0xea, 0x4f, 0x43, 0x7e}"
Guid = StringArrayToList(Guid)
- Buffer = pack('=LHHBBBBBBBB',
- Guid[0],
- Guid[1],
- Guid[2],
- Guid[3],
- Guid[4],
- Guid[5],
- Guid[6],
- Guid[7],
- Guid[8],
- Guid[9],
- Guid[10],
- )
+ Buffer = PackByteFormatGUID(Guid)
b = pack("=L", DATABASE_VERSION)
Buffer += b
diff --git a/BaseTools/Source/Python/AutoGen/GenVar.py b/BaseTools/Source/Python/AutoGen/GenVar.py
index e3595bb62315..bc750bd72f37 100644
--- a/BaseTools/Source/Python/AutoGen/GenVar.py
+++ b/BaseTools/Source/Python/AutoGen/GenVar.py
@@ -26,22 +26,6 @@ var_info = collections.namedtuple("uefi_var", "pcdindex,pcdname,defaultstoragena
NvStorageHeaderSize = 28
VariableHeaderSize = 32
-def PackGUID(Guid):
- GuidBuffer = pack('=LHHBBBBBBBB',
- int(Guid[0], 16),
- int(Guid[1], 16),
- int(Guid[2], 16),
- int(Guid[3][-4:-2], 16),
- int(Guid[3][-2:], 16),
- int(Guid[4][-12:-10], 16),
- int(Guid[4][-10:-8], 16),
- int(Guid[4][-8:-6], 16),
- int(Guid[4][-6:-4], 16),
- int(Guid[4][-4:-2], 16),
- int(Guid[4][-2:], 16)
- )
- return GuidBuffer
-
class VariableMgr(object):
def __init__(self, DefaultStoreMap,SkuIdMap):
self.VarInfo = []
@@ -87,14 +71,7 @@ class VariableMgr(object):
data_type = item.data_type
value_list = item.default_value.strip("{").strip("}").split(",")
if data_type in DataType.TAB_PCD_NUMERIC_TYPES:
- if data_type == ["BOOLEAN", DataType.TAB_UINT8]:
- data_flag = "=B"
- elif data_type == DataType.TAB_UINT16:
- data_flag = "=H"
- elif data_type == DataType.TAB_UINT32:
- data_flag = "=L"
- elif data_type == DataType.TAB_UINT64:
- data_flag = "=Q"
+ data_flag = PACK_CODE_BY_SIZE[MAX_SIZE_TYPE[data_type]]
data = value_list[0]
value_list = []
for data_byte in pack(data_flag,int(data,16) if data.upper().startswith('0X') else int(data)):
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index f5b1574e4440..3ca113c25669 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -34,13 +34,6 @@ class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
self.var_check_info.append(var_check_tab)
def dump(self, dest, Phase):
-
- FormatMap = {}
- FormatMap[1] = "=B"
- FormatMap[2] = "=H"
- FormatMap[4] = "=L"
- FormatMap[8] = "=Q"
-
if not os.path.isabs(dest):
return
if not os.path.exists(dest):
@@ -106,19 +99,8 @@ class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
realLength += 4
Guid = var_check_tab.Guid
- b = pack('=LHHBBBBBBBB',
- Guid[0],
- Guid[1],
- Guid[2],
- Guid[3],
- Guid[4],
- Guid[5],
- Guid[6],
- Guid[7],
- Guid[8],
- Guid[9],
- Guid[10],
- )
+ b = PackByteFormatGUID(Guid)
+
Buffer += b
realLength += 16
@@ -156,14 +138,14 @@ class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
realLength += 1
for v_data in item.data:
if type(v_data) in (int, long):
- b = pack(FormatMap[item.StorageWidth], v_data)
+ b = pack(PACK_CODE_BY_SIZE[item.StorageWidth], v_data)
Buffer += b
realLength += item.StorageWidth
else:
- b = pack(FormatMap[item.StorageWidth], v_data[0])
+ b = pack(PACK_CODE_BY_SIZE[item.StorageWidth], v_data[0])
Buffer += b
realLength += item.StorageWidth
- b = pack(FormatMap[item.StorageWidth], v_data[1])
+ b = pack(PACK_CODE_BY_SIZE[item.StorageWidth], v_data[1])
Buffer += b
realLength += item.StorageWidth
diff --git a/BaseTools/Source/Python/Common/DataType.py b/BaseTools/Source/Python/Common/DataType.py
index 48700ba82012..c3058d751470 100644
--- a/BaseTools/Source/Python/Common/DataType.py
+++ b/BaseTools/Source/Python/Common/DataType.py
@@ -530,3 +530,14 @@ SECTIONS_HAVE_ITEM_AFTER_ARCH = [TAB_LIBRARY_CLASSES.upper(), TAB_DEPEX.upper(),
PCDS_DYNAMICEX_HII.upper(),
TAB_BUILD_OPTIONS.upper(),
TAB_INCLUDES.upper()]
+
+#
+# pack codes as used in PcdDb and elsewhere
+#
+PACK_PATTERN_GUID = '=LHHBBBBBBBB'
+PACK_CODE_BY_SIZE = {8:'=Q',
+ 4:'=L',
+ 2:'=H',
+ 1:'=B',
+ 0:'=B',
+ 16:""}
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index f6ebaa60e23f..86c69808422c 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -2087,20 +2087,7 @@ class SkuClass():
# Pack a registry format GUID
#
def PackRegistryFormatGuid(Guid):
- Guid = Guid.split('-')
- return pack('=LHHBBBBBBBB',
- int(Guid[0], 16),
- int(Guid[1], 16),
- int(Guid[2], 16),
- int(Guid[3][-4:-2], 16),
- int(Guid[3][-2:], 16),
- int(Guid[4][-12:-10], 16),
- int(Guid[4][-10:-8], 16),
- int(Guid[4][-8:-6], 16),
- int(Guid[4][-6:-4], 16),
- int(Guid[4][-4:-2], 16),
- int(Guid[4][-2:], 16)
- )
+ return PackGUID(Guid.split('-'))
## Get the integer value from string like "14U" or integer like 2
#
@@ -2126,6 +2113,42 @@ def GetIntegerValue(Input):
else:
return int(String)
+#
+# Pack a GUID (registry format) list into a buffer and return it
+#
+def PackGUID(Guid):
+ return pack(PACK_PATTERN_GUID,
+ int(Guid[0], 16),
+ int(Guid[1], 16),
+ int(Guid[2], 16),
+ int(Guid[3][-4:-2], 16),
+ int(Guid[3][-2:], 16),
+ int(Guid[4][-12:-10], 16),
+ int(Guid[4][-10:-8], 16),
+ int(Guid[4][-8:-6], 16),
+ int(Guid[4][-6:-4], 16),
+ int(Guid[4][-4:-2], 16),
+ int(Guid[4][-2:], 16)
+ )
+
+#
+# Pack a GUID (byte) list into a buffer and return it
+#
+def PackByteFormatGUID(Guid):
+ return pack(PACK_PATTERN_GUID,
+ Guid[0],
+ Guid[1],
+ Guid[2],
+ Guid[3],
+ Guid[4],
+ Guid[5],
+ Guid[6],
+ Guid[7],
+ Guid[8],
+ Guid[9],
+ Guid[10],
+ )
+
##
#
# This acts like the main() function for the script, unless it is 'import'ed into another
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index 2e57c5e92365..40e8bcd5aa72 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -26,7 +26,7 @@ import FfsFileStatement
from GenFdsGlobalVariable import GenFdsGlobalVariable
from GenFds import GenFds
from CommonDataClass.FdfClass import FvClassObject
-from Common.Misc import SaveFileOnChange
+from Common.Misc import SaveFileOnChange, PackGUID
from Common.LongFilePathSupport import CopyLongFilePath
from Common.LongFilePathSupport import OpenLongFilePath as open
@@ -366,10 +366,7 @@ class FV (FvClassObject):
# FV UI name
#
Buffer += (pack('HH', (FvUiLen + 16 + 4), 0x0002)
- + pack('=LHHBBBBBBBB', int(Guid[0], 16), int(Guid[1], 16), int(Guid[2], 16),
- int(Guid[3][-4:-2], 16), int(Guid[3][-2:], 16), int(Guid[4][-12:-10], 16),
- int(Guid[4][-10:-8], 16), int(Guid[4][-8:-6], 16), int(Guid[4][-6:-4], 16),
- int(Guid[4][-4:-2], 16), int(Guid[4][-2:], 16))
+ + PackGUID(Guid)
+ self.UiFvName)
for Index in range (0, len(self.FvExtEntryType)):
@@ -403,20 +400,7 @@ class FV (FvClassObject):
Buffer += pack('B', int(ByteList[Index1], 16))
Guid = self.FvNameGuid.split('-')
- Buffer = pack('=LHHBBBBBBBBL',
- int(Guid[0], 16),
- int(Guid[1], 16),
- int(Guid[2], 16),
- int(Guid[3][-4:-2], 16),
- int(Guid[3][-2:], 16),
- int(Guid[4][-12:-10], 16),
- int(Guid[4][-10:-8], 16),
- int(Guid[4][-8:-6], 16),
- int(Guid[4][-6:-4], 16),
- int(Guid[4][-4:-2], 16),
- int(Guid[4][-2:], 16),
- TotalSize
- ) + Buffer
+ Buffer = PackGUID(Guid) + pack('=L', TotalSize) + Buffer
#
# Generate FV extension header file if the total size is not zero
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index bcbe6f89b48b..505bbcd29e0a 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -313,7 +313,7 @@ class DepexParser(object):
Statement = gOpCodeList[struct.unpack("B", OpCode)[0]]
if Statement in ["BEFORE", "AFTER", "PUSH"]:
GuidValue = "%08X-%04X-%04X-%02X%02X-%02X%02X%02X%02X%02X%02X" % \
- struct.unpack("=LHHBBBBBBBB", DepexFile.read(16))
+ struct.unpack(PACK_PATTERN_GUID, DepexFile.read(16))
GuidString = self._GuidDb.get(GuidValue, GuidValue)
Statement = "%s %s" % (Statement, GuidString)
DepexStatement.append(Statement)
--
2.16.2.windows.1
* [PATCH v1 35/42] BaseTools: remove unused variable
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (33 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 34/42] BaseTools: standardize GUID and pack size Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 36/42] BaseTools: GenFds - use existing shared string Jaben Carsey
` (7 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/Common/DataType.py | 1 -
1 file changed, 1 deletion(-)
diff --git a/BaseTools/Source/Python/Common/DataType.py b/BaseTools/Source/Python/Common/DataType.py
index c3058d751470..62cc84b25f59 100644
--- a/BaseTools/Source/Python/Common/DataType.py
+++ b/BaseTools/Source/Python/Common/DataType.py
@@ -113,7 +113,6 @@ BINARY_FILE_TYPE_UI = 'UI'
BINARY_FILE_TYPE_BIN = 'BIN'
BINARY_FILE_TYPE_FV = 'FV'
-PLATFORM_COMPONENT_TYPE_LIBRARY = 'LIBRARY'
PLATFORM_COMPONENT_TYPE_LIBRARY_CLASS = 'LIBRARY_CLASS'
PLATFORM_COMPONENT_TYPE_MODULE = 'MODULE'
--
2.16.2.windows.1
* [PATCH v1 36/42] BaseTools: GenFds - use existing shared string
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (34 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 35/42] BaseTools: remove unused variable Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 37/42] BaseTools: missed a copyright update Jaben Carsey
` (6 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/GenFds/Fv.py | 34 ++++++++++----------
1 file changed, 17 insertions(+), 17 deletions(-)
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index 40e8bcd5aa72..188282a27cd2 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -29,8 +29,8 @@ from CommonDataClass.FdfClass import FvClassObject
from Common.Misc import SaveFileOnChange, PackGUID
from Common.LongFilePathSupport import CopyLongFilePath
from Common.LongFilePathSupport import OpenLongFilePath as open
+from Common.DataType import TAB_LINE_BREAK
-T_CHAR_LF = '\n'
FV_UI_EXT_ENTY_GUID = 'A67DF1FA-8DE8-4E98-AF09-4BDF2EFFBC7C'
## generate FV
@@ -111,7 +111,7 @@ class FV (FvClassObject):
if not Flag:
self.FvInfFile.writelines("EFI_FILE_NAME = " + \
FileName + \
- T_CHAR_LF)
+ TAB_LINE_BREAK)
# Process Modules in FfsList
for FfsFile in self.FfsList :
@@ -125,7 +125,7 @@ class FV (FvClassObject):
if not Flag:
self.FvInfFile.writelines("EFI_FILE_NAME = " + \
FileName + \
- T_CHAR_LF)
+ TAB_LINE_BREAK)
if not Flag:
SaveFileOnChange(self.InfFileName, self.FvInfFile.getvalue(), False)
self.FvInfFile.close()
@@ -270,36 +270,36 @@ class FV (FvClassObject):
#
# Add [Options]
#
- self.FvInfFile.writelines("[options]" + T_CHAR_LF)
+ self.FvInfFile.writelines("[options]" + TAB_LINE_BREAK)
if BaseAddress is not None :
self.FvInfFile.writelines("EFI_BASE_ADDRESS = " + \
BaseAddress + \
- T_CHAR_LF)
+ TAB_LINE_BREAK)
if BlockSize is not None:
self.FvInfFile.writelines("EFI_BLOCK_SIZE = " + \
'0x%X' %BlockSize + \
- T_CHAR_LF)
+ TAB_LINE_BREAK)
if BlockNum is not None:
self.FvInfFile.writelines("EFI_NUM_BLOCKS = " + \
' 0x%X' %BlockNum + \
- T_CHAR_LF)
+ TAB_LINE_BREAK)
else:
if self.BlockSizeList == []:
if not self._GetBlockSize():
#set default block size is 1
- self.FvInfFile.writelines("EFI_BLOCK_SIZE = 0x1" + T_CHAR_LF)
+ self.FvInfFile.writelines("EFI_BLOCK_SIZE = 0x1" + TAB_LINE_BREAK)
for BlockSize in self.BlockSizeList :
if BlockSize[0] is not None:
self.FvInfFile.writelines("EFI_BLOCK_SIZE = " + \
'0x%X' %BlockSize[0] + \
- T_CHAR_LF)
+ TAB_LINE_BREAK)
if BlockSize[1] is not None:
self.FvInfFile.writelines("EFI_NUM_BLOCKS = " + \
' 0x%X' %BlockSize[1] + \
- T_CHAR_LF)
+ TAB_LINE_BREAK)
if self.BsBaseAddress is not None:
self.FvInfFile.writelines('EFI_BOOT_DRIVER_BASE_ADDRESS = ' + \
@@ -310,11 +310,11 @@ class FV (FvClassObject):
#
# Add attribute
#
- self.FvInfFile.writelines("[attributes]" + T_CHAR_LF)
+ self.FvInfFile.writelines("[attributes]" + TAB_LINE_BREAK)
self.FvInfFile.writelines("EFI_ERASE_POLARITY = " + \
' %s' %ErasePloarity + \
- T_CHAR_LF)
+ TAB_LINE_BREAK)
if not (self.FvAttributeDict is None):
for FvAttribute in self.FvAttributeDict.keys() :
if FvAttribute == "FvUsedSizeEnable":
@@ -325,12 +325,12 @@ class FV (FvClassObject):
FvAttribute + \
' = ' + \
self.FvAttributeDict[FvAttribute] + \
- T_CHAR_LF )
+ TAB_LINE_BREAK )
if self.FvAlignment is not None:
self.FvInfFile.writelines("EFI_FVB2_ALIGNMENT_" + \
self.FvAlignment.strip() + \
" = TRUE" + \
- T_CHAR_LF)
+ TAB_LINE_BREAK)
#
# Generate FV extension header file
@@ -416,14 +416,14 @@ class FV (FvClassObject):
os.remove (self.InfFileName)
self.FvInfFile.writelines("EFI_FV_EXT_HEADER_FILE_NAME = " + \
FvExtHeaderFileName + \
- T_CHAR_LF)
+ TAB_LINE_BREAK)
#
# Add [Files]
#
- self.FvInfFile.writelines("[files]" + T_CHAR_LF)
+ self.FvInfFile.writelines("[files]" + TAB_LINE_BREAK)
if VtfDict and self.UiFvName in VtfDict:
self.FvInfFile.writelines("EFI_FILE_NAME = " + \
VtfDict[self.UiFvName] + \
- T_CHAR_LF)
+ TAB_LINE_BREAK)
--
2.16.2.windows.1
* [PATCH v1 37/42] BaseTools: missed a copyright update
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (35 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 36/42] BaseTools: GenFds - use existing shared string Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 38/42] BaseTools: Remove lists form set construction Jaben Carsey
` (5 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
index bf01de35a6e6..9572e2e3bb4c 100644
--- a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
+++ b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
@@ -31,7 +31,7 @@ from Common.LongFilePathSupport import OpenLongFilePath as open
# Version and Copyright
__version_number__ = ("0.10" + " " + gBUILD_VERSION)
__version__ = "%prog Version " + __version_number__
-__copyright__ = "Copyright (c) 2008 - 2010, Intel Corporation. All rights reserved."
+__copyright__ = "Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved."
#====================================== Internal Libraries ========================================
--
2.16.2.windows.1
* [PATCH v1 38/42] BaseTools: Remove lists form set construction
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (36 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 37/42] BaseTools: missed a copyright update Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 39/42] BaseTools: refactor Depex optomization Jaben Carsey
` (4 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
There is no need to build a list just to construct a set. Remove lists
that are only used in constructing sets.
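A minimal illustration of the pattern (the sample data here is hypothetical):
names = ['STANDARD', 'DEFAULT', 'STANDARD']
# before: a throw-away list is built first, then handed to set()
unique_before = set([n for n in names])
# after: the generator expression feeds set() directly, no temporary list
unique_after = set(n for n in names)
assert unique_before == unique_after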
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/GenDepex.py | 7 +++----
BaseTools/Source/Python/Workspace/DscBuildData.py | 10 +++++-----
2 files changed, 8 insertions(+), 9 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/GenDepex.py b/BaseTools/Source/Python/AutoGen/GenDepex.py
index 3dcbad5be666..533447efe82e 100644
--- a/BaseTools/Source/Python/AutoGen/GenDepex.py
+++ b/BaseTools/Source/Python/AutoGen/GenDepex.py
@@ -54,7 +54,7 @@ gType2Phase = {
#
class DependencyExpression:
- ArchProtocols = set([
+ ArchProtocols = {
'665e3ff6-46cc-11d4-9a38-0090273fc14d', # 'gEfiBdsArchProtocolGuid'
'26baccb1-6f42-11d4-bce7-0080c73c8881', # 'gEfiCpuArchProtocolGuid'
'26baccb2-6f42-11d4-bce7-0080c73c8881', # 'gEfiMetronomeArchProtocolGuid'
@@ -67,8 +67,7 @@ class DependencyExpression:
'6441f818-6362-4e44-b570-7dba31dd2453', # 'gEfiVariableWriteArchProtocolGuid'
'1e5668e2-8481-11d4-bcf1-0080c73c8881', # 'gEfiVariableArchProtocolGuid'
'665e3ff5-46cc-11d4-9a38-0090273fc14d' # 'gEfiWatchdogTimerArchProtocolGuid'
- ]
- )
+ }
OpcodePriority = {
DEPEX_OPCODE_AND : 1,
@@ -304,7 +303,7 @@ class DependencyExpression:
# don't generate depex if all operands are architecture protocols
if self.ModuleType in ['UEFI_DRIVER', 'DXE_DRIVER', 'DXE_RUNTIME_DRIVER', 'DXE_SAL_DRIVER', 'DXE_SMM_DRIVER', 'MM_STANDALONE'] and \
Op == DEPEX_OPCODE_AND and \
- self.ArchProtocols == set([GuidStructureStringToGuidString(Guid) for Guid in AllOperand]):
+ self.ArchProtocols == set(GuidStructureStringToGuidString(Guid) for Guid in AllOperand):
self.PostfixNotation = []
return
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 48690aa357f7..235392e1e4fb 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -942,7 +942,7 @@ class DscBuildData(PlatformBuildClassObject):
for skuid in pcd.SkuInfoList:
skuobj = pcd.SkuInfoList.get(skuid)
if TAB_DEFAULT_STORES_DEFAULT not in skuobj.DefaultStoreDict:
- PcdDefaultStoreSet = set([defaultstorename for defaultstorename in skuobj.DefaultStoreDict])
+ PcdDefaultStoreSet = set(defaultstorename for defaultstorename in skuobj.DefaultStoreDict)
mindefaultstorename = DefaultStoreMgr.GetMin(PcdDefaultStoreSet)
skuobj.DefaultStoreDict[TAB_DEFAULT_STORES_DEFAULT] = copy.deepcopy(skuobj.DefaultStoreDict[mindefaultstorename])
return Pcds
@@ -1360,7 +1360,7 @@ class DscBuildData(PlatformBuildClassObject):
nextskuid = self.SkuIdMgr.GetNextSkuId(nextskuid)
if NoDefault:
continue
- PcdDefaultStoreSet = set([defaultstorename for defaultstorename in stru_pcd.SkuOverrideValues[nextskuid]])
+ PcdDefaultStoreSet = set(defaultstorename for defaultstorename in stru_pcd.SkuOverrideValues[nextskuid])
mindefaultstorename = DefaultStoreMgr.GetMin(PcdDefaultStoreSet)
for defaultstoreid in DefaultStores:
@@ -1404,7 +1404,7 @@ class DscBuildData(PlatformBuildClassObject):
if str_pcd_obj.Type not in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
continue
- PcdDefaultStoreSet = set([defaultstorename for skuobj in str_pcd_obj.SkuInfoList.values() for defaultstorename in skuobj.DefaultStoreDict])
+ PcdDefaultStoreSet = set(defaultstorename for skuobj in str_pcd_obj.SkuInfoList.values() for defaultstorename in skuobj.DefaultStoreDict)
DefaultStoreObj = DefaultStore(self._GetDefaultStores())
mindefaultstorename = DefaultStoreObj.GetMin(PcdDefaultStoreSet)
str_pcd_obj.SkuInfoList[self.SkuIdMgr.SystemSkuId].HiiDefaultValue = str_pcd_obj.SkuInfoList[self.SkuIdMgr.SystemSkuId].DefaultStoreDict[mindefaultstorename]
@@ -2308,7 +2308,7 @@ class DscBuildData(PlatformBuildClassObject):
Pcds = {}
DefaultStoreObj = DefaultStore(self._GetDefaultStores())
SkuIds = {skuname:skuid for skuname,skuid in self.SkuIdMgr.AvailableSkuIdSet.items() if skuname != TAB_COMMON}
- DefaultStores = set([storename for pcdobj in PcdSet.values() for skuobj in pcdobj.SkuInfoList.values() for storename in skuobj.DefaultStoreDict])
+ DefaultStores = set(storename for pcdobj in PcdSet.values() for skuobj in pcdobj.SkuInfoList.values() for storename in skuobj.DefaultStoreDict)
for PcdCName, TokenSpaceGuid in PcdSet:
PcdObj = PcdSet[(PcdCName, TokenSpaceGuid)]
self.CopyDscRawValue(PcdObj)
@@ -2498,7 +2498,7 @@ class DscBuildData(PlatformBuildClassObject):
invalidhii = []
for pcdname in Pcds:
pcd = Pcds[pcdname]
- varnameset = set([sku.VariableName for (skuid,sku) in pcd.SkuInfoList.items()])
+ varnameset = set(sku.VariableName for (skuid,sku) in pcd.SkuInfoList.items())
if len(varnameset) > 1:
invalidhii.append(".".join((pcdname[1],pcdname[0])))
if len(invalidhii):
--
2.16.2.windows.1
* [PATCH v1 39/42] BaseTools: refactor Depex optomization
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (37 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 38/42] BaseTools: Remove lists form set construction Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 40/42] BaseTools: dont make iterator into list if not needed Jaben Carsey
` (3 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
No need to make a list from the set; just pop the item off.
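Roughly, the change looks like this (OpcodeList is hypothetical sample data standing in for self.OpcodeList):
OpcodeList = ['AND', 'AND', 'AND']
# before: materialize a list from the set just to read element 0
ValidOpcode = list(set(OpcodeList))
Op = ValidOpcode[0]
# after: keep the set and pop its only element once the size check passes
OpcodeSet = set(OpcodeList)
if len(OpcodeSet) == 1:
    Op = OpcodeSet.pop()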
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/GenDepex.py | 10 +++++++---
1 file changed, 7 insertions(+), 3 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/GenDepex.py b/BaseTools/Source/Python/AutoGen/GenDepex.py
index 533447efe82e..f415eaaa7723 100644
--- a/BaseTools/Source/Python/AutoGen/GenDepex.py
+++ b/BaseTools/Source/Python/AutoGen/GenDepex.py
@@ -271,10 +271,14 @@ class DependencyExpression:
## Simply optimize the dependency expression by removing duplicated operands
def Optimize(self):
- ValidOpcode = list(set(self.OpcodeList))
- if len(ValidOpcode) != 1 or ValidOpcode[0] not in [DEPEX_OPCODE_AND, DEPEX_OPCODE_OR]:
+ OpcodeSet = set(self.OpcodeList)
+ # if the set does not contain exactly one opcode, return
+ if len(OpcodeSet) != 1:
+ return
+ Op = OpcodeSet.pop()
+ # if Op is neither OR nor AND, return
+ if Op not in [DEPEX_OPCODE_AND, DEPEX_OPCODE_OR]:
return
- Op = ValidOpcode[0]
NewOperand = []
AllOperand = set()
for Token in self.PostfixNotation:
--
2.16.2.windows.1
* [PATCH v1 40/42] BaseTools: dont make iterator into list if not needed
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (38 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 39/42] BaseTools: refactor Depex optomization Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 41/42] BaseTools: create base expression class Jaben Carsey
` (2 subsequent siblings)
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
Functions (like join) can use the iterator just as easily.
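For example (hypothetical package list; str.join accepts any iterable):
Packages = ['MdePkg/MdePkg.dec', 'MdeModulePkg/MdeModulePkg.dec']
# before: a temporary list is built only to be joined and thrown away
before = "\n\t".join([str(P) for P in Packages])
# after: join() consumes the generator expression directly
after = "\n\t".join(str(P) for P in Packages)
assert before == after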
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/AutoGen/AutoGen.py | 10 +++++-----
BaseTools/Source/Python/AutoGen/GenVar.py | 8 ++++----
BaseTools/Source/Python/Common/MigrationUtilities.py | 2 +-
BaseTools/Source/Python/Common/Misc.py | 8 ++++----
BaseTools/Source/Python/Common/String.py | 12 ++++++------
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py | 4 ++--
BaseTools/Source/Python/GenFds/GenFds.py | 2 +-
BaseTools/Source/Python/Workspace/DscBuildData.py | 6 +++---
BaseTools/Source/Python/Workspace/InfBuildData.py | 12 ++++++------
BaseTools/Source/Python/Workspace/MetaDataTable.py | 4 ++--
BaseTools/Source/Python/build/build.py | 2 +-
11 files changed, 35 insertions(+), 35 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 39d5932a9a66..6cd54c2e9aa1 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -460,7 +460,7 @@ class WorkspaceAutoGen(AutoGen):
'build',
FORMAT_INVALID,
"Building modules from source INFs, following PCD use %s and %s access method. It must be corrected to use only one access method." % (i, j),
- ExtraData="%s" % '\n\t'.join([str(P[1]+'.'+P[0]) for P in Intersections])
+ ExtraData="%s" % '\n\t'.join(str(P[1]+'.'+P[0]) for P in Intersections)
)
#
@@ -2295,7 +2295,7 @@ class PlatformAutoGen(AutoGen):
#
for Item in LibraryList:
if ConsumedByList[Item] != [] and Item in Constructor and len(Constructor) > 1:
- ErrorMessage = "\tconsumed by " + "\n\tconsumed by ".join([str(L) for L in ConsumedByList[Item]])
+ ErrorMessage = "\tconsumed by " + "\n\tconsumed by ".join(str(L) for L in ConsumedByList[Item])
EdkLogger.error("build", BUILD_ERROR, 'Library [%s] with constructors has a cycle' % str(Item),
ExtraData=ErrorMessage, File=self.MetaFile)
if Item not in SortedLibraryList:
@@ -2415,7 +2415,7 @@ class PlatformAutoGen(AutoGen):
if Sku.VariableGuid == '': continue
Sku.VariableGuidValue = GuidValue(Sku.VariableGuid, self.PackageList, self.MetaFile.Path)
if Sku.VariableGuidValue is None:
- PackageList = "\n\t".join([str(P) for P in self.PackageList])
+ PackageList = "\n\t".join(str(P) for P in self.PackageList)
EdkLogger.error(
'build',
RESOURCE_NOT_AVAILABLE,
@@ -3122,7 +3122,7 @@ class ModuleAutoGen(AutoGen):
for Depex in DepexList:
for key in Depex:
DepexStr += '[Depex.%s.%s]\n' % key
- DepexStr += '\n'.join(['# '+ val for val in Depex[key]])
+ DepexStr += '\n'.join('# '+ val for val in Depex[key])
DepexStr += '\n\n'
if not DepexStr:
return '[Depex.%s]\n' % self.Arch
@@ -3136,7 +3136,7 @@ class ModuleAutoGen(AutoGen):
DepexStr += ' AND '
DepexStr += '('
for D in Depex.values():
- DepexStr += ' '.join([val for val in D])
+ DepexStr += ' '.join(val for val in D)
Index = DepexStr.find('END')
if Index > -1 and Index == len(DepexStr) - 3:
DepexStr = DepexStr[:-3]
diff --git a/BaseTools/Source/Python/AutoGen/GenVar.py b/BaseTools/Source/Python/AutoGen/GenVar.py
index bc750bd72f37..ffd490520dcc 100644
--- a/BaseTools/Source/Python/AutoGen/GenVar.py
+++ b/BaseTools/Source/Python/AutoGen/GenVar.py
@@ -80,7 +80,7 @@ class VariableMgr(object):
try:
newvaluestr = "{" + ",".join(VariableMgr.assemble_variable(newvalue)) +"}"
except:
- EdkLogger.error("build", AUTOGEN_ERROR, "Variable offset conflict in PCDs: %s \n" % (" and ".join([item.pcdname for item in sku_var_info_offset_list])))
+ EdkLogger.error("build", AUTOGEN_ERROR, "Variable offset conflict in PCDs: %s \n" % (" and ".join(item.pcdname for item in sku_var_info_offset_list)))
n = sku_var_info_offset_list[0]
indexedvarinfo[key] = [var_info(n.pcdindex,n.pcdname,n.defaultstoragename,n.skuname,n.var_name, n.var_guid, "0x00",n.var_attribute,newvaluestr , newvaluestr , DataType.TAB_VOID)]
self.VarInfo = [item[0] for item in indexedvarinfo.values()]
@@ -116,9 +116,9 @@ class VariableMgr(object):
default_sku_default = indexedvarinfo[index].get((DataType.TAB_DEFAULT,DataType.TAB_DEFAULT_STORES_DEFAULT))
if default_sku_default.data_type not in DataType.TAB_PCD_NUMERIC_TYPES:
- var_max_len = max([len(var_item.default_value.split(",")) for var_item in sku_var_info.values()])
+ var_max_len = max(len(var_item.default_value.split(",")) for var_item in sku_var_info.values())
if len(default_sku_default.default_value.split(",")) < var_max_len:
- tail = ",".join([ "0x00" for i in range(var_max_len-len(default_sku_default.default_value.split(",")))])
+ tail = ",".join("0x00" for i in range(var_max_len-len(default_sku_default.default_value.split(","))))
default_data_buffer = VariableMgr.PACK_VARIABLES_DATA(default_sku_default.default_value,default_sku_default.data_type,tail)
@@ -136,7 +136,7 @@ class VariableMgr(object):
if default_sku_default.data_type not in DataType.TAB_PCD_NUMERIC_TYPES:
if len(other_sku_other.default_value.split(",")) < var_max_len:
- tail = ",".join([ "0x00" for i in range(var_max_len-len(other_sku_other.default_value.split(",")))])
+ tail = ",".join("0x00" for i in range(var_max_len-len(other_sku_other.default_value.split(","))))
others_data_buffer = VariableMgr.PACK_VARIABLES_DATA(other_sku_other.default_value,other_sku_other.data_type,tail)
diff --git a/BaseTools/Source/Python/Common/MigrationUtilities.py b/BaseTools/Source/Python/Common/MigrationUtilities.py
index 27d30a11b529..cc6bb4924dc3 100644
--- a/BaseTools/Source/Python/Common/MigrationUtilities.py
+++ b/BaseTools/Source/Python/Common/MigrationUtilities.py
@@ -506,7 +506,7 @@ def GetTextFileInfo(FileName, TagTuple):
#
def GetXmlFileInfo(FileName, TagTuple):
XmlDom = XmlParseFile(FileName)
- return tuple([XmlElement(XmlDom, XmlTag) for XmlTag in TagTuple])
+ return tuple(XmlElement(XmlDom, XmlTag) for XmlTag in TagTuple)
## Parse migration command line options
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index 86c69808422c..f01b4bd05b52 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -833,7 +833,7 @@ class TemplateString(object):
def Append(self, AppendString, Dictionary=None):
if Dictionary:
SectionList = self._Parse(AppendString)
- self.String += "".join([S.Instantiate(Dictionary) for S in SectionList])
+ self.String += "".join(S.Instantiate(Dictionary) for S in SectionList)
else:
self.String += AppendString
@@ -844,7 +844,7 @@ class TemplateString(object):
# @retval str The string replaced with placeholder values
#
def Replace(self, Dictionary=None):
- return "".join([S.Instantiate(Dictionary) for S in self._TemplateSectionList])
+ return "".join(S.Instantiate(Dictionary) for S in self._TemplateSectionList)
## Progress indicator class
#
@@ -1926,7 +1926,7 @@ class DefaultStore():
if not self.DefaultStores or "0" in self.DefaultStores:
return "0",TAB_DEFAULT_STORES_DEFAULT
else:
- minvalue = min([int(value_str) for value_str in self.DefaultStores])
+ minvalue = min(int(value_str) for value_str in self.DefaultStores)
return (str(minvalue), self.DefaultStores[str(minvalue)])
def GetMin(self,DefaultSIdList):
if not DefaultSIdList:
@@ -2023,7 +2023,7 @@ class SkuClass():
skuorderset.append(self.GetSkuChain(skuname))
skuorder = []
- for index in range(max([len(item) for item in skuorderset])):
+ for index in range(max(len(item) for item in skuorderset)):
for subset in skuorderset:
if index > len(subset)-1:
continue
diff --git a/BaseTools/Source/Python/Common/String.py b/BaseTools/Source/Python/Common/String.py
index 389a3ca51d27..70784bcbbdaa 100644
--- a/BaseTools/Source/Python/Common/String.py
+++ b/BaseTools/Source/Python/Common/String.py
@@ -818,27 +818,27 @@ def StringToArray(String):
if isinstance(String, unicode):
if len(unicode) == 0:
return "{0x00,0x00}"
- return "{%s,0x00,0x00}" % ",".join(["0x%02x,0x00" % ord(C) for C in String])
+ return "{%s,0x00,0x00}" % ",".join("0x%02x,0x00" % ord(C) for C in String)
elif String.startswith('L"'):
if String == "L\"\"":
return "{0x00,0x00}"
else:
- return "{%s,0x00,0x00}" % ",".join(["0x%02x,0x00" % ord(C) for C in String[2:-1]])
+ return "{%s,0x00,0x00}" % ",".join("0x%02x,0x00" % ord(C) for C in String[2:-1])
elif String.startswith('"'):
if String == "\"\"":
return "{0x00,0x00}"
else:
StringLen = len(String[1:-1])
if StringLen % 2:
- return "{%s,0x00}" % ",".join(["0x%02x" % ord(C) for C in String[1:-1]])
+ return "{%s,0x00}" % ",".join("0x%02x" % ord(C) for C in String[1:-1])
else:
- return "{%s,0x00,0x00}" % ",".join(["0x%02x" % ord(C) for C in String[1:-1]])
+ return "{%s,0x00,0x00}" % ",".join("0x%02x" % ord(C) for C in String[1:-1])
elif String.startswith('{'):
StringLen = len(String.split(","))
if StringLen % 2:
- return "{%s,0x00}" % ",".join([ C.strip() for C in String[1:-1].split(',')])
+ return "{%s,0x00}" % ",".join(C.strip() for C in String[1:-1].split(','))
else:
- return "{%s}" % ",".join([ C.strip() for C in String[1:-1].split(',')])
+ return "{%s}" % ",".join(C.strip() for C in String[1:-1].split(','))
else:
if len(String.split()) % 2:
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
index 6b980150f53e..760f88cc7294 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
@@ -1,7 +1,7 @@
## @file
# This file is used to create/update/query/erase table for files
#
-# Copyright (c) 2008 - 2014, Intel Corporation. All rights reserved.<BR>
+# Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
# This program and the accompanying materials
# are licensed and made available under the terms and conditions of the BSD License
# which accompanies this distribution. The full text of the license may be found at
@@ -73,7 +73,7 @@ class Table(object):
self.ID = self.ID + self._ID_STEP_
if self.ID >= (self.IdBase + self._ID_MAX_):
self.ID = self.IdBase + self._ID_STEP_
- Values = ", ".join([str(Arg) for Arg in Args])
+ Values = ", ".join(str(Arg) for Arg in Args)
SqlCommand = "insert into %s values(%s, %s)" % (self.Table, self.ID, Values)
EdkLogger.debug(EdkLogger.DEBUG_5, SqlCommand)
self.Cur.execute(SqlCommand)
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index 4b8c7913d2db..7e80575dd794 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -761,7 +761,7 @@ class GenFds :
length = F.tell()
F.seek(4)
TmpStr = unpack('%dh' % ((length - 4) / 2), F.read())
- Name = ''.join([chr(c) for c in TmpStr[:-1]])
+ Name = ''.join(chr(c) for c in TmpStr[:-1])
else:
FileList = []
if 'fv.sec.txt' in MatchDict:
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 235392e1e4fb..3d3db61d851b 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -1514,7 +1514,7 @@ class DscBuildData(PlatformBuildClassObject):
return len(Value) - 2
return len(Value)
- return str(max([pcd_size for pcd_size in [get_length(item) for item in sku_values]]))
+ return str(max(get_length(item) for item in sku_values))
@staticmethod
def ExecuteCommand (Command):
@@ -2078,7 +2078,7 @@ class DscBuildData(PlatformBuildClassObject):
SearchPathList = []
SearchPathList.append(os.path.normpath(mws.join(GlobalData.gWorkspace, "BaseTools/Source/C/Include")))
SearchPathList.append(os.path.normpath(mws.join(GlobalData.gWorkspace, "BaseTools/Source/C/Common")))
- SearchPathList.extend([str(item) for item in IncSearchList])
+ SearchPathList.extend(str(item) for item in IncSearchList)
IncFileList = GetDependencyList(IncludeFileFullPaths,SearchPathList)
for include_file in IncFileList:
MakeApp += "$(OBJECTS) : %s\n" % include_file
@@ -2324,7 +2324,7 @@ class DscBuildData(PlatformBuildClassObject):
if PcdType in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
for skuid in PcdObj.SkuInfoList:
skuobj = PcdObj.SkuInfoList[skuid]
- mindefaultstorename = DefaultStoreObj.GetMin(set([defaultstorename for defaultstorename in skuobj.DefaultStoreDict]))
+ mindefaultstorename = DefaultStoreObj.GetMin(set(defaultstorename for defaultstorename in skuobj.DefaultStoreDict))
for defaultstorename in DefaultStores:
if defaultstorename not in skuobj.DefaultStoreDict:
skuobj.DefaultStoreDict[defaultstorename] = copy.deepcopy(skuobj.DefaultStoreDict[mindefaultstorename])
diff --git a/BaseTools/Source/Python/Workspace/InfBuildData.py b/BaseTools/Source/Python/Workspace/InfBuildData.py
index a725a2a2a772..d21f38bc7f46 100644
--- a/BaseTools/Source/Python/Workspace/InfBuildData.py
+++ b/BaseTools/Source/Python/Workspace/InfBuildData.py
@@ -718,7 +718,7 @@ class InfBuildData(ModuleBuildClassObject):
CName = Record[0]
Value = ProtocolValue(CName, self.Packages, self.MetaFile.Path)
if Value is None:
- PackageList = "\n\t".join([str(P) for P in self.Packages])
+ PackageList = "\n\t".join(str(P) for P in self.Packages)
EdkLogger.error('build', RESOURCE_NOT_AVAILABLE,
"Value of Protocol [%s] is not found under [Protocols] section in" % CName,
ExtraData=PackageList, File=self.MetaFile, Line=Record[-1])
@@ -743,7 +743,7 @@ class InfBuildData(ModuleBuildClassObject):
CName = Record[0]
Value = PpiValue(CName, self.Packages, self.MetaFile.Path)
if Value is None:
- PackageList = "\n\t".join([str(P) for P in self.Packages])
+ PackageList = "\n\t".join(str(P) for P in self.Packages)
EdkLogger.error('build', RESOURCE_NOT_AVAILABLE,
"Value of PPI [%s] is not found under [Ppis] section in " % CName,
ExtraData=PackageList, File=self.MetaFile, Line=Record[-1])
@@ -768,7 +768,7 @@ class InfBuildData(ModuleBuildClassObject):
CName = Record[0]
Value = GuidValue(CName, self.Packages, self.MetaFile.Path)
if Value is None:
- PackageList = "\n\t".join([str(P) for P in self.Packages])
+ PackageList = "\n\t".join(str(P) for P in self.Packages)
EdkLogger.error('build', RESOURCE_NOT_AVAILABLE,
"Value of Guid [%s] is not found under [Guids] section in" % CName,
ExtraData=PackageList, File=self.MetaFile, Line=Record[-1])
@@ -938,7 +938,7 @@ class InfBuildData(ModuleBuildClassObject):
if Value is None:
Value = GuidValue(Token, self.Packages, self.MetaFile.Path)
if Value is None:
- PackageList = "\n\t".join([str(P) for P in self.Packages])
+ PackageList = "\n\t".join(str(P) for P in self.Packages)
EdkLogger.error('build', RESOURCE_NOT_AVAILABLE,
"Value of [%s] is not found in" % Token,
ExtraData=PackageList, File=self.MetaFile, Line=Record[-1])
@@ -981,7 +981,7 @@ class InfBuildData(ModuleBuildClassObject):
if TokenSpaceGuid not in self.Guids:
Value = GuidValue(TokenSpaceGuid, self.Packages, self.MetaFile.Path)
if Value is None:
- PackageList = "\n\t".join([str(P) for P in self.Packages])
+ PackageList = "\n\t".join(str(P) for P in self.Packages)
EdkLogger.error('build', RESOURCE_NOT_AVAILABLE,
"Value of Guid [%s] is not found under [Guids] section in" % TokenSpaceGuid,
ExtraData=PackageList, File=self.MetaFile, Line=LineNo)
@@ -1151,7 +1151,7 @@ class InfBuildData(ModuleBuildClassObject):
FORMAT_INVALID,
"PCD [%s.%s] in [%s] is not found in dependent packages:" % (TokenSpaceGuid, PcdRealName, self.MetaFile),
File=self.MetaFile, Line=LineNo,
- ExtraData="\t%s" % '\n\t'.join([str(P) for P in self.Packages])
+ ExtraData="\t%s" % '\n\t'.join(str(P) for P in self.Packages)
)
Pcds[PcdCName, TokenSpaceGuid] = Pcd
diff --git a/BaseTools/Source/Python/Workspace/MetaDataTable.py b/BaseTools/Source/Python/Workspace/MetaDataTable.py
index 0cfec9023261..e37a10c82f8f 100644
--- a/BaseTools/Source/Python/Workspace/MetaDataTable.py
+++ b/BaseTools/Source/Python/Workspace/MetaDataTable.py
@@ -1,7 +1,7 @@
## @file
# This file is used to create/update/query/erase table for files
#
-# Copyright (c) 2008 - 2014, Intel Corporation. All rights reserved.<BR>
+# Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
# This program and the accompanying materials
# are licensed and made available under the terms and conditions of the BSD License
# which accompanies this distribution. The full text of the license may be found at
@@ -73,7 +73,7 @@ class Table(object):
self.ID = self.ID + self._ID_STEP_
if self.ID >= (self.IdBase + self._ID_MAX_):
self.ID = self.IdBase + self._ID_STEP_
- Values = ", ".join([str(Arg) for Arg in Args])
+ Values = ", ".join(str(Arg) for Arg in Args)
SqlCommand = "insert into %s values(%s, %s)" % (self.Table, self.ID, Values)
EdkLogger.debug(EdkLogger.DEBUG_5, SqlCommand)
self.Cur.execute(SqlCommand)
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 0ca78c1fa451..6b01617c5c5f 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -545,7 +545,7 @@ class BuildTask:
# while not BuildTask._ErrorFlag.isSet() and \
while len(BuildTask._RunningQueue) > 0:
EdkLogger.verbose("Waiting for thread ending...(%d)" % len(BuildTask._RunningQueue))
- EdkLogger.debug(EdkLogger.DEBUG_8, "Threads [%s]" % ", ".join([Th.getName() for Th in threading.enumerate()]))
+ EdkLogger.debug(EdkLogger.DEBUG_8, "Threads [%s]" % ", ".join(Th.getName() for Th in threading.enumerate()))
# avoid tense loop
time.sleep(0.1)
except BaseException, X:
--
2.16.2.windows.1
* [PATCH v1 41/42] BaseTools: create base expression class
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (39 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 40/42] BaseTools: dont make iterator into list if not needed Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 42/42] BaseTools: use set instead of list Jaben Carsey
2018-05-04 4:33 ` [PATCH v1 00/42] BaseTools: refactoring patches Zhu, Yonghong
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
This class holds a function shared between ValueExpression and RangeExpression.
Change both classes to derive from it and to call its __init__ from their own __init__.
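A simplified, self-contained sketch of the resulting layout (the real parsers also track an index into the token stream and define _GetOperator; only the shared pieces are shown):
class BaseExpression(object):
    LogicalOperators = {'&&': 'and', '||': 'or'}
    def _IsOperator(self, OpSet):
        # shared membership test, roughly mirroring the method moved below
        if self._Token in OpSet:
            if self._Token in self.LogicalOperators:
                self._Token = self.LogicalOperators[self._Token]
            return True
        return False
class ValueExpression(BaseExpression):
    def __init__(self, Token):
        super(ValueExpression, self).__init__()
        self._Token = Token
class RangeExpression(BaseExpression):
    def __init__(self, Token):
        super(RangeExpression, self).__init__()
        self._Token = Token
# ValueExpression('&&')._IsOperator({'&&', '||'}) evaluates to True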
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/Common/Expression.py | 29 ++++++++++++--------
BaseTools/Source/Python/Common/RangeExpression.py | 27 ++----------------
2 files changed, 20 insertions(+), 36 deletions(-)
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index 53504c110d6d..b5f76d860773 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -201,7 +201,22 @@ def IntToStr(Value):
SupportedInMacroList = ['TARGET', 'TOOL_CHAIN_TAG', 'ARCH', 'FAMILY']
-class ValueExpression(object):
+class BaseExpression(object):
+ def __init__(self, *args, **kwargs):
+ super(BaseExpression, self).__init__()
+
+ # Check if current token matches the operators given from parameter
+ def _IsOperator(self, OpSet):
+ Idx = self._Idx
+ self._GetOperator()
+ if self._Token in OpSet:
+ if self._Token in self.LogicalOperators:
+ self._Token = self.LogicalOperators[self._Token]
+ return True
+ self._Idx = Idx
+ return False
+
+class ValueExpression(BaseExpression):
# Logical operator mapping
LogicalOperators = {
'&&' : 'and', '||' : 'or',
@@ -307,6 +322,7 @@ class ValueExpression(object):
return Val
def __init__(self, Expression, SymbolTable={}):
+ super(ValueExpression, self).__init__(self, Expression, SymbolTable)
self._NoProcess = False
if type(Expression) != type(''):
self._Expr = Expression
@@ -780,17 +796,6 @@ class ValueExpression(object):
self._Token = OpToken
return OpToken
- # Check if current token matches the operators given from OpList
- def _IsOperator(self, OpList):
- Idx = self._Idx
- self._GetOperator()
- if self._Token in OpList:
- if self._Token in self.LogicalOperators:
- self._Token = self.LogicalOperators[self._Token]
- return True
- self._Idx = Idx
- return False
-
class ValueExpressionEx(ValueExpression):
def __init__(self, PcdValue, PcdType, SymbolTable={}):
ValueExpression.__init__(self, PcdValue, SymbolTable)
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index 35b35e4893bc..a37d0ca70d87 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -16,7 +16,7 @@ from Common.GlobalData import *
from CommonDataClass.Exceptions import BadExpression
from CommonDataClass.Exceptions import WrnExpression
import uuid
-from Common.Expression import PcdPattern
+from Common.Expression import PcdPattern,BaseExpression
from Common.DataType import *
ERR_STRING_EXPR = 'This operator cannot be used in string expression: [%s].'
@@ -186,7 +186,7 @@ def GetOperatorObject(Operator):
else:
raise BadExpression("Bad Operator")
-class RangeExpression(object):
+class RangeExpression(BaseExpression):
# Logical operator mapping
LogicalOperators = {
'&&' : 'and', '||' : 'or',
@@ -347,6 +347,7 @@ class RangeExpression(object):
def __init__(self, Expression, PcdDataType, SymbolTable = {}):
+ super(RangeExpression, self).__init__(self, Expression, PcdDataType, SymbolTable)
self._NoProcess = False
if type(Expression) != type(''):
self._Expr = Expression
@@ -693,25 +694,3 @@ class RangeExpression(object):
raise BadExpression(ERR_OPERATOR_UNSUPPORT % OpToken)
self._Token = OpToken
return OpToken
-
- # Check if current token matches the operators given from OpList
- def _IsOperator(self, OpList):
- Idx = self._Idx
- self._GetOperator()
- if self._Token in OpList:
- if self._Token in self.LogicalOperators:
- self._Token = self.LogicalOperators[self._Token]
- return True
- self._Idx = Idx
- return False
-
-
-
-
-
-
-
-
-
-
-# UTRangeList()
--
2.16.2.windows.1
* [PATCH v1 42/42] BaseTools: use set instead of list
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (40 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 41/42] BaseTools: create base expression class Jaben Carsey
@ 2018-04-27 22:32 ` Jaben Carsey
2018-05-04 4:33 ` [PATCH v1 00/42] BaseTools: refactoring patches Zhu, Yonghong
42 siblings, 0 replies; 44+ messages in thread
From: Jaben Carsey @ 2018-04-27 22:32 UTC (permalink / raw)
To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu
As we only do membership (in) testing on these collections, a set is better than a list.
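For illustration (sample operator collection only; 'in' on a set is a hash lookup rather than a linear scan over a list):
ops = {"==", "!=", ">=", "<=", ">", "<", "in", "not in"}
assert ">" in ops
assert "?" not in ops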
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
BaseTools/Source/Python/Common/Expression.py | 68 ++++++++++----------
BaseTools/Source/Python/Common/RangeExpression.py | 14 ++--
2 files changed, 41 insertions(+), 41 deletions(-)
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index b5f76d860773..de691ccaa612 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -171,7 +171,7 @@ def ReplaceExprMacro(String, Macros, ExceptionList = None):
RetStr += '0'
elif not InQuote:
Tklst = RetStr.split()
- if Tklst and Tklst[-1] in ['IN', 'in'] and ExceptionList and Macro not in ExceptionList:
+ if Tklst and Tklst[-1] in {'IN', 'in'} and ExceptionList and Macro not in ExceptionList:
raise BadExpression(ERR_IN_OPERAND)
# Make sure the macro in exception list is encapsulated by double quote
# For example: DEFINE ARCH = IA32 X64
@@ -243,10 +243,10 @@ class ValueExpression(BaseExpression):
def Eval(Operator, Oprand1, Oprand2 = None):
WrnExp = None
- if Operator not in ["==", "!=", ">=", "<=", ">", "<", "in", "not in"] and \
+ if Operator not in {"==", "!=", ">=", "<=", ">", "<", "in", "not in"} and \
(type(Oprand1) == type('') or type(Oprand2) == type('')):
raise BadExpression(ERR_STRING_EXPR % Operator)
- if Operator in ['in', 'not in']:
+ if Operator in {'in', 'not in'}:
if type(Oprand1) != type(''):
Oprand1 = IntToStr(Oprand1)
if type(Oprand2) != type(''):
@@ -259,19 +259,19 @@ class ValueExpression(BaseExpression):
}
EvalStr = ''
- if Operator in ["!", "NOT", "not"]:
+ if Operator in {"!", "NOT", "not"}:
if type(Oprand1) == type(''):
raise BadExpression(ERR_STRING_EXPR % Operator)
EvalStr = 'not Oprand1'
- elif Operator in ["~"]:
+ elif Operator in {"~"}:
if type(Oprand1) == type(''):
raise BadExpression(ERR_STRING_EXPR % Operator)
EvalStr = '~ Oprand1'
else:
- if Operator in ["+", "-"] and (type(True) in [type(Oprand1), type(Oprand2)]):
+ if Operator in {"+", "-"} and (type(True) in {type(Oprand1), type(Oprand2)}):
# Boolean in '+'/'-' will be evaluated but raise warning
WrnExp = WrnExpression(WRN_BOOL_EXPR)
- elif type('') in [type(Oprand1), type(Oprand2)] and type(Oprand1)!= type(Oprand2):
+ elif type('') in {type(Oprand1), type(Oprand2)} and type(Oprand1)!= type(Oprand2):
# == between string and number/boolean will always return False, != return True
if Operator == "==":
WrnExp = WrnExpression(WRN_EQCMP_STR_OTHERS)
@@ -284,10 +284,10 @@ class ValueExpression(BaseExpression):
else:
raise BadExpression(ERR_RELCMP_STR_OTHERS % Operator)
elif TypeDict[type(Oprand1)] != TypeDict[type(Oprand2)]:
- if Operator in ["==", "!=", ">=", "<=", ">", "<"] and set((TypeDict[type(Oprand1)], TypeDict[type(Oprand2)])) == set((TypeDict[type(True)], TypeDict[type(0)])):
+ if Operator in {"==", "!=", ">=", "<=", ">", "<"} and set((TypeDict[type(Oprand1)], TypeDict[type(Oprand2)])) == set((TypeDict[type(True)], TypeDict[type(0)])):
# comparison between number and boolean is allowed
pass
- elif Operator in ['&', '|', '^', "and", "or"] and set((TypeDict[type(Oprand1)], TypeDict[type(Oprand2)])) == set((TypeDict[type(True)], TypeDict[type(0)])):
+ elif Operator in {'&', '|', '^', "and", "or"} and set((TypeDict[type(Oprand1)], TypeDict[type(Oprand2)])) == set((TypeDict[type(True)], TypeDict[type(0)])):
# bitwise and logical operation between number and boolean is allowed
pass
else:
@@ -310,7 +310,7 @@ class ValueExpression(BaseExpression):
except Exception, Excpt:
raise BadExpression(str(Excpt))
- if Operator in ['and', 'or']:
+ if Operator in {'and', 'or'}:
if Val:
Val = True
else:
@@ -410,13 +410,13 @@ class ValueExpression(BaseExpression):
# Template function to parse binary operators which have same precedence
# Expr [Operator Expr]*
- def _ExprFuncTemplate(self, EvalFunc, OpLst):
+ def _ExprFuncTemplate(self, EvalFunc, OpSet):
Val = EvalFunc()
- while self._IsOperator(OpLst):
+ while self._IsOperator(OpSet):
Op = self._Token
if Op == '?':
Val2 = EvalFunc()
- if self._IsOperator(':'):
+ if self._IsOperator({':'}):
Val3 = EvalFunc()
if Val:
Val = Val2
@@ -431,35 +431,35 @@ class ValueExpression(BaseExpression):
return Val
# A [? B]*
def _ConExpr(self):
- return self._ExprFuncTemplate(self._OrExpr, ['?', ':'])
+ return self._ExprFuncTemplate(self._OrExpr, {'?', ':'})
# A [|| B]*
def _OrExpr(self):
- return self._ExprFuncTemplate(self._AndExpr, ["OR", "or", "||"])
+ return self._ExprFuncTemplate(self._AndExpr, {"OR", "or", "||"})
# A [&& B]*
def _AndExpr(self):
- return self._ExprFuncTemplate(self._BitOr, ["AND", "and", "&&"])
+ return self._ExprFuncTemplate(self._BitOr, {"AND", "and", "&&"})
# A [ | B]*
def _BitOr(self):
- return self._ExprFuncTemplate(self._BitXor, ["|"])
+ return self._ExprFuncTemplate(self._BitXor, {"|"})
# A [ ^ B]*
def _BitXor(self):
- return self._ExprFuncTemplate(self._BitAnd, ["XOR", "xor", "^"])
+ return self._ExprFuncTemplate(self._BitAnd, {"XOR", "xor", "^"})
# A [ & B]*
def _BitAnd(self):
- return self._ExprFuncTemplate(self._EqExpr, ["&"])
+ return self._ExprFuncTemplate(self._EqExpr, {"&"})
# A [ == B]*
def _EqExpr(self):
Val = self._RelExpr()
- while self._IsOperator(["==", "!=", "EQ", "NE", "IN", "in", "!", "NOT", "not"]):
+ while self._IsOperator({"==", "!=", "EQ", "NE", "IN", "in", "!", "NOT", "not"}):
Op = self._Token
- if Op in ["!", "NOT", "not"]:
- if not self._IsOperator(["IN", "in"]):
+ if Op in {"!", "NOT", "not"}:
+ if not self._IsOperator({"IN", "in"}):
raise BadExpression(ERR_REL_NOT_IN)
Op += ' ' + self._Token
try:
@@ -471,29 +471,29 @@ class ValueExpression(BaseExpression):
# A [ > B]*
def _RelExpr(self):
- return self._ExprFuncTemplate(self._ShiftExpr, ["<=", ">=", "<", ">", "LE", "GE", "LT", "GT"])
+ return self._ExprFuncTemplate(self._ShiftExpr, {"<=", ">=", "<", ">", "LE", "GE", "LT", "GT"})
def _ShiftExpr(self):
- return self._ExprFuncTemplate(self._AddExpr, ["<<", ">>"])
+ return self._ExprFuncTemplate(self._AddExpr, {"<<", ">>"})
# A [ + B]*
def _AddExpr(self):
- return self._ExprFuncTemplate(self._MulExpr, ["+", "-"])
+ return self._ExprFuncTemplate(self._MulExpr, {"+", "-"})
# A [ * B]*
def _MulExpr(self):
- return self._ExprFuncTemplate(self._UnaryExpr, ["*", "/", "%"])
+ return self._ExprFuncTemplate(self._UnaryExpr, {"*", "/", "%"})
# [!]*A
def _UnaryExpr(self):
- if self._IsOperator(["!", "NOT", "not"]):
+ if self._IsOperator({"!", "NOT", "not"}):
Val = self._UnaryExpr()
try:
return self.Eval('not', Val)
except WrnExpression, Warn:
self._WarnExcept = Warn
return Warn.result
- if self._IsOperator(["~"]):
+ if self._IsOperator({"~"}):
Val = self._UnaryExpr()
try:
return self.Eval('~', Val)
@@ -531,7 +531,7 @@ class ValueExpression(BaseExpression):
if self._Token.startswith('"') or self._Token.startswith('L"'):
Flag = 0
for Index in range(len(self._Token)):
- if self._Token[Index] in ['"']:
+ if self._Token[Index] in {'"'}:
if self._Token[Index - 1] == '\\':
continue
Flag += 1
@@ -540,7 +540,7 @@ class ValueExpression(BaseExpression):
if self._Token.startswith("'") or self._Token.startswith("L'"):
Flag = 0
for Index in range(len(self._Token)):
- if self._Token[Index] in ["'"]:
+ if self._Token[Index] in {"'"}:
if self._Token[Index - 1] == '\\':
continue
Flag += 1
@@ -645,9 +645,9 @@ class ValueExpression(BaseExpression):
if self._Token.startswith('"'):
self._Token = self._Token[1:-1]
- elif self._Token in ["FALSE", "false", "False"]:
+ elif self._Token in {"FALSE", "false", "False"}:
self._Token = False
- elif self._Token in ["TRUE", "true", "True"]:
+ elif self._Token in {"TRUE", "true", "True"}:
self._Token = True
else:
self.__IsNumberToken()
@@ -841,7 +841,7 @@ class ValueExpressionEx(ValueExpression):
elif Item.startswith(TAB_UINT64):
ItemSize = 8
ValueType = TAB_UINT64
- elif Item[0] in ['"',"'",'L']:
+ elif Item[0] in {'"',"'",'L'}:
ItemSize = 0
ValueType = TAB_VOID
else:
@@ -998,7 +998,7 @@ class ValueExpressionEx(ValueExpression):
Item = '0x%x' % TmpValue if type(TmpValue) != type('') else TmpValue
if ItemSize == 0:
ItemValue, ItemSize = ParseFieldValue(Item)
- if Item[0] not in ['"','L','{'] and ItemSize > 1:
+ if Item[0] not in {'"','L','{'} and ItemSize > 1:
raise BadExpression("Byte array number %s should less than 0xFF." % Item)
else:
ItemValue = ParseFieldValue(Item)[0]
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index a37d0ca70d87..53bb11b8cc9b 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -416,9 +416,9 @@ class RangeExpression(BaseExpression):
# Template function to parse binary operators which have same precedence
# Expr [Operator Expr]*
- def _ExprFuncTemplate(self, EvalFunc, OpLst):
+ def _ExprFuncTemplate(self, EvalFunc, OpSet):
Val = EvalFunc()
- while self._IsOperator(OpLst):
+ while self._IsOperator(OpSet):
Op = self._Token
try:
Val = self.Eval(Op, Val, EvalFunc())
@@ -429,18 +429,18 @@ class RangeExpression(BaseExpression):
# A [|| B]*
def _OrExpr(self):
- return self._ExprFuncTemplate(self._AndExpr, ["OR", "or"])
+ return self._ExprFuncTemplate(self._AndExpr, {"OR", "or"})
# A [&& B]*
def _AndExpr(self):
- return self._ExprFuncTemplate(self._NeExpr, ["AND", "and"])
+ return self._ExprFuncTemplate(self._NeExpr, {"AND", "and"})
def _NeExpr(self):
Val = self._RelExpr()
- while self._IsOperator([ "!=", "NOT", "not"]):
+ while self._IsOperator({"!=", "NOT", "not"}):
Op = self._Token
if Op in ["!", "NOT", "not"]:
- if not self._IsOperator(["IN", "in"]):
+ if not self._IsOperator({"IN", "in"}):
raise BadExpression(ERR_REL_NOT_IN)
Op += ' ' + self._Token
try:
@@ -452,7 +452,7 @@ class RangeExpression(BaseExpression):
# [!]*A
def _RelExpr(self):
- if self._IsOperator(["NOT" , "LE", "GE", "LT", "GT", "EQ", "XOR"]):
+ if self._IsOperator({"NOT" , "LE", "GE", "LT", "GT", "EQ", "XOR"}):
Token = self._Token
Val = self._NeExpr()
try:
--
2.16.2.windows.1
^ permalink raw reply related [flat|nested] 44+ messages in thread
* Re: [PATCH v1 00/42] BaseTools: refactoring patches
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
` (41 preceding siblings ...)
2018-04-27 22:32 ` [PATCH v1 42/42] BaseTools: use set instead of list Jaben Carsey
@ 2018-05-04 4:33 ` Zhu, Yonghong
42 siblings, 0 replies; 44+ messages in thread
From: Zhu, Yonghong @ 2018-05-04 4:33 UTC (permalink / raw)
To: Carsey, Jaben, edk2-devel@lists.01.org
This patch series looks good to me.
Only one comment, for Patch 34: it should use DataType.PACK_CODE_BY_SIZE rather than PACK_CODE_BY_SIZE directly, because the file uses "import Common.DataType as DataType" in its header. I will correct this when pushing the patch.
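A stand-alone sketch of the naming rule behind that comment (using the standard library so it runs outside BaseTools): an "import ... as ..." statement binds only the alias in the importing module, so a bare reference to a name defined inside that module raises NameError and must be qualified with the alias.

    # Stand-alone illustration, analogous to "import Common.DataType as DataType".
    import string as StringMod

    print(StringMod.digits)        # qualified access works: 0123456789
    try:
        print(digits)              # bare name was never bound in this namespace
    except NameError as Err:
        print("unqualified access fails: %s" % Err)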
Reviewed-by: Yonghong Zhu <yonghong.zhu@intel.com>
Best Regards,
Zhu Yonghong
-----Original Message-----
From: edk2-devel [mailto:edk2-devel-bounces@lists.01.org] On Behalf Of Jaben Carsey
Sent: Saturday, April 28, 2018 6:32 AM
To: edk2-devel@lists.01.org
Subject: [edk2] [PATCH v1 00/42] BaseTools: refactoring patches
first goal in this series is a reduction in meaningless memory allocation or use. An example is creating lists from iterators for the sole purpose of passing them into another function, where the function would take the iterator directly.
Another example is making a list just to create a set.
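A small illustration of these two patterns (hypothetical names, not from the BaseTools sources): a consumer that accepts any iterable can be handed the iterator directly, and a set can be built from a generator without an intermediate list.

    def count_nonempty(lines):
        # sum() consumes any iterable, so no intermediate list is needed
        return sum(1 for line in lines if line.strip())

    lines = (text for text in ["a", "", "b", " ", "c"])
    # Wasteful: count_nonempty(list(lines)) materializes every element first.
    # Direct: pass the generator straight through.
    print(count_nonempty(lines))   # 3

    # Same idea for sets: build directly from the generator
    # instead of set([x for x in data]).
    data = ["a", "b", "a"]
    unique = set(x for x in data)
    print(sorted(unique))          # ['a', 'b']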
second goal is the beginning of organizational changes. This includes moving functions from one class to another if the function operates primarily on the second class's data. Another example: if a class has a small function only called in __init__, the logic can just be added to __init__ directly.
important note: one patch removes lots of trailing whitespace without making any other changes.
Jaben Carsey (42):
BaseTools: FdfParser - update to remove duplicate constant value
BaseTools: AutoGen - update to remove duplicate constant value
BaseTools: check before accessing members in __eq__
BaseTools: this function has no purpose.
BaseTools: AutoGen - refactor assemble_variable
BaseTools: AutoGen - refactor dictionary access
BaseTools: AutoGen - GenVar refactor static methods
BaseTools: AutoGen - share StripComments API
BaseTools: AutoGen - refactor class factory
BaseTools: Eot - remove unused lists
BaseTools: Eot - refactor global data
BaseTools: AutoGen - remove global line
BaseTools: AutoGen - UniClassObject refactor static methods
BaseTools: refactor to use list not dict
BaseTools: eliminate {} from dictionary contructor call
BaseTools: remove Compound statements
BaseTools: Workspace - refactor a dict
BaseTools: move PCD size calculation functions to PcdClassObject
BaseTools: AutoGen - refactor out functions only called in __init__
BaseTools: AutoGen - refactor out a list
BaseTools: AutoGen - refactor out a useless class
BaseTools: AutoGen - no need to recompute
BaseTools: refactor __init__ functions to not compute temporary
variable
BaseTools: AutoGen - remove function no one calls
BaseTools: AutoGen - move function to clean file namespace
BaseTools: AutoGen - remove another function no one calls
BaseTools: Refactor to share GUID packing function
BaseTools: AutoGen - refactor function to remove extra variables
BaseTools: AutoGen - refactor more functions only called in __init__
BaseTools: remove unused member variable
BaseTools: remove redundant content in InfSectionParser
BaseTools: trim whitespace
BaseTools: AutoGen - add Opcode constants
BaseTools: standardize GUID and pack size
BaseTools: remove unused variable
BaseTools: GenFds - use existing shared string
BaseTools: missed a copyright update
BaseTools: Remove lists form set construction
BaseTools: refactor Depex optomization
BaseTools: dont make iterator into list if not needed
BaseTools: create base expression class
BaseTools: use set instead of list
BaseTools/Source/Python/AutoGen/AutoGen.py | 200 +--
BaseTools/Source/Python/AutoGen/BuildEngine.py | 25 +-
BaseTools/Source/Python/AutoGen/GenC.py | 111 +-
BaseTools/Source/Python/AutoGen/GenDepex.py | 127 +-
BaseTools/Source/Python/AutoGen/GenPcdDb.py | 333 ++---
BaseTools/Source/Python/AutoGen/GenVar.py | 124 +-
BaseTools/Source/Python/AutoGen/IdfClassObject.py | 113 +-
BaseTools/Source/Python/AutoGen/InfSectionParser.py | 21 +-
BaseTools/Source/Python/AutoGen/StrGather.py | 26 +-
BaseTools/Source/Python/AutoGen/UniClassObject.py | 61 +-
BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py | 141 +-
BaseTools/Source/Python/BPDG/BPDG.py | 56 +-
BaseTools/Source/Python/BPDG/GenVpd.py | 132 +-
BaseTools/Source/Python/BPDG/StringTable.py | 10 +-
BaseTools/Source/Python/Common/BuildVersion.py | 6 +-
BaseTools/Source/Python/Common/DataType.py | 26 +-
BaseTools/Source/Python/Common/Database.py | 17 +-
BaseTools/Source/Python/Common/Expression.py | 97 +-
BaseTools/Source/Python/Common/MigrationUtilities.py | 66 +-
BaseTools/Source/Python/Common/Misc.py | 109 +-
BaseTools/Source/Python/Common/MultipleWorkspace.py | 17 +-
BaseTools/Source/Python/Common/RangeExpression.py | 159 +--
BaseTools/Source/Python/Common/String.py | 14 +-
BaseTools/Source/Python/Common/ToolDefClassObject.py | 5 +-
BaseTools/Source/Python/Common/VariableAttributes.py | 12 +-
BaseTools/Source/Python/Common/VpdInfoFile.py | 84 +-
BaseTools/Source/Python/CommonDataClass/FdfClass.py | 28 +-
BaseTools/Source/Python/Ecc/CLexer.py | 8 +-
BaseTools/Source/Python/Ecc/CParser.py | 1468 ++++++++++----------
BaseTools/Source/Python/Ecc/Check.py | 22 +-
BaseTools/Source/Python/Ecc/CodeFragment.py | 3 +-
BaseTools/Source/Python/Ecc/CodeFragmentCollector.py | 124 +-
BaseTools/Source/Python/Ecc/Configuration.py | 10 +-
BaseTools/Source/Python/Ecc/Ecc.py | 26 +-
BaseTools/Source/Python/Ecc/Exception.py | 14 +-
BaseTools/Source/Python/Ecc/FileProfile.py | 5 +-
BaseTools/Source/Python/Ecc/MetaDataParser.py | 46 +-
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py | 4 +-
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py | 100 +-
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py | 88 +-
BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py | 4 +-
BaseTools/Source/Python/Ecc/Xml/__init__.py | 6 +-
BaseTools/Source/Python/Ecc/c.py | 12 +-
BaseTools/Source/Python/Eot/CLexer.py | 8 +-
BaseTools/Source/Python/Eot/CParser.py | 1468 ++++++++++----------
BaseTools/Source/Python/Eot/Eot.py | 21 +-
BaseTools/Source/Python/Eot/EotGlobalData.py | 41 -
BaseTools/Source/Python/Eot/Report.py | 4 +-
BaseTools/Source/Python/GenFds/Capsule.py | 2 +-
BaseTools/Source/Python/GenFds/CapsuleData.py | 18 +-
BaseTools/Source/Python/GenFds/EfiSection.py | 8 +-
BaseTools/Source/Python/GenFds/Fd.py | 2 +-
BaseTools/Source/Python/GenFds/FdfParser.py | 173 ++-
BaseTools/Source/Python/GenFds/Ffs.py | 10 +-
BaseTools/Source/Python/GenFds/FfsFileStatement.py | 4 +-
BaseTools/Source/Python/GenFds/FfsInfStatement.py | 62 +-
BaseTools/Source/Python/GenFds/Fv.py | 70 +-
BaseTools/Source/Python/GenFds/GenFds.py | 32 +-
BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py | 36 +-
BaseTools/Source/Python/GenFds/GuidSection.py | 2 +-
BaseTools/Source/Python/GenFds/OptRomFileStatement.py | 6 +-
BaseTools/Source/Python/GenFds/OptRomInfStatement.py | 21 +-
BaseTools/Source/Python/GenFds/OptionRom.py | 49 +-
BaseTools/Source/Python/GenFds/Region.py | 4 +-
BaseTools/Source/Python/GenFds/Section.py | 2 +-
BaseTools/Source/Python/GenFds/Vtf.py | 18 +-
BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py | 28 +-
BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py | 6 +-
BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py | 34 +-
BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py | 30 +-
BaseTools/Source/Python/Table/Table.py | 20 +-
BaseTools/Source/Python/Table/TableDataModel.py | 14 +-
BaseTools/Source/Python/Table/TableDec.py | 12 +-
BaseTools/Source/Python/Table/TableDsc.py | 12 +-
BaseTools/Source/Python/Table/TableEotReport.py | 6 +-
BaseTools/Source/Python/Table/TableFdf.py | 12 +-
BaseTools/Source/Python/Table/TableFile.py | 12 +-
BaseTools/Source/Python/Table/TableFunction.py | 8 +-
BaseTools/Source/Python/Table/TableIdentifier.py | 4 +-
BaseTools/Source/Python/Table/TableInf.py | 12 +-
BaseTools/Source/Python/Table/TablePcd.py | 4 +-
BaseTools/Source/Python/Table/TableReport.py | 6 +-
BaseTools/Source/Python/TargetTool/TargetTool.py | 24 +-
BaseTools/Source/Python/Trim/Trim.py | 20 +-
BaseTools/Source/Python/Workspace/BuildClassObject.py | 57 +-
BaseTools/Source/Python/Workspace/DscBuildData.py | 41 +-
BaseTools/Source/Python/Workspace/InfBuildData.py | 12 +-
BaseTools/Source/Python/Workspace/MetaDataTable.py | 4 +-
BaseTools/Source/Python/Workspace/MetaFileParser.py | 2 +-
BaseTools/Source/Python/Workspace/MetaFileTable.py | 88 +-
BaseTools/Source/Python/Workspace/WorkspaceDatabase.py | 24 +-
BaseTools/Source/Python/build/BuildReport.py | 36 +-
BaseTools/Source/Python/build/build.py | 19 +-
BaseTools/Source/Python/sitecustomize.py | 2 +-
94 files changed, 3207 insertions(+), 3463 deletions(-)
--
2.16.2.windows.1
_______________________________________________
edk2-devel mailing list
edk2-devel@lists.01.org
https://lists.01.org/mailman/listinfo/edk2-devel
^ permalink raw reply [flat|nested] 44+ messages in thread
end of thread, other threads:[~2018-05-04 4:33 UTC | newest]
Thread overview: 44+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2018-04-27 22:32 [PATCH v1 00/42] BaseTools: refactoring patches Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 01/42] BaseTools: FdfParser - update to remove duplicate constant value Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 02/42] BaseTools: AutoGen " Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 03/42] BaseTools: check before accessing members in __eq__ Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 04/42] BaseTools: this function has no purpose Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 05/42] BaseTools: AutoGen - refactor assemble_variable Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 06/42] BaseTools: AutoGen - refactor dictionary access Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 07/42] BaseTools: AutoGen - GenVar refactor static methods Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 08/42] BaseTools: AutoGen - share StripComments API Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 09/42] BaseTools: AutoGen - refactor class factory Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 10/42] BaseTools: Eot - remove unused lists Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 11/42] BaseTools: Eot - refactor global data Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 12/42] BaseTools: AutoGen - remove global line Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 13/42] BaseTools: AutoGen - UniClassObject refactor static methods Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 14/42] BaseTools: refactor to use list not dict Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 15/42] BaseTools: eliminate {} from dictionary contructor call Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 16/42] BaseTools: remove Compound statements Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 17/42] BaseTools: Workspace - refactor a dict Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 18/42] BaseTools: move PCD size calculation functions to PcdClassObject Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 19/42] BaseTools: AutoGen - refactor out functions only called in __init__ Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 20/42] BaseTools: AutoGen - refactor out a list Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 21/42] BaseTools: AutoGen - refactor out a useless class Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 22/42] BaseTools: AutoGen - no need to recompute Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 23/42] BaseTools: refactor __init__ functions to not compute temporary variable Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 24/42] BaseTools: AutoGen - remove function no one calls Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 25/42] BaseTools: AutoGen - move function to clean file namespace Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 26/42] BaseTools: AutoGen - remove another function no one calls Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 27/42] BaseTools: Refactor to share GUID packing function Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 28/42] BaseTools: AutoGen - refactor function to remove extra variables Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 29/42] BaseTools: AutoGen - refactor more functions only called in __init__ Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 30/42] BaseTools: remove unused member variable Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 31/42] BaseTools: remove redundant content in InfSectionParser Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 32/42] BaseTools: trim whitespace Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 33/42] BaseTools: AutoGen - add Opcode constants Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 34/42] BaseTools: standardize GUID and pack size Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 35/42] BaseTools: remove unused variable Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 36/42] BaseTools: GenFds - use existing shared string Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 37/42] BaseTools: missed a copyright update Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 38/42] BaseTools: Remove lists form set construction Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 39/42] BaseTools: refactor Depex optomization Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 40/42] BaseTools: dont make iterator into list if not needed Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 41/42] BaseTools: create base expression class Jaben Carsey
2018-04-27 22:32 ` [PATCH v1 42/42] BaseTools: use set instead of list Jaben Carsey
2018-05-04 4:33 ` [PATCH v1 00/42] BaseTools: refactoring patches Zhu, Yonghong