From mboxrd@z Thu Jan 1 00:00:00 1970
From: "Chiu, Chasel"
To: "Loo, Tung Lun", "devel@edk2.groups.io"
CC: "Ma, Maurice", "Desimone, Nathaniel L", "Zeng, Star"
Subject: Re: [PATCH v4] IntelFsp2Pkg: Add Config Editor tool support
Date: Tue, 4 May 2021 01:45:12 +0000
References: <20210430074646.707-1-tung.lun.loo@intel.com>
In-Reply-To: <20210430074646.707-1-tung.lun.loo@intel.com>

Hi Tung Lun,

Thanks for adding BSF support and the "Show Binary Configuration" feature.

For BSF and binary patching, I am comparing against the BCT tool behavior: do you think we could align with the BCT steps so that there is no change to the user experience when using this new tool?

BCT binary patching steps: open BSF -> modify some UPD value in the UI -> patch fsp.fd

When showing FSP binary information, I encountered the error below; please help to check it. (A short decode-fallback sketch is included below for reference.)

Thanks,
Chasel

Error in "Show Binary Configuration":

Exception in Tkinter callback
Traceback (most recent call last):
  File "C:\Python38\lib\tkinter\__init__.py", line 1883, in __call__
    return self.func(*args)
  File "ConfigEditor.py", line 1231, in load_from_fd
    self.load_fd_file(path)
  File "ConfigEditor.py", line 1240, in load_fd_file
    fd.OutputFsp()
  File "ConfigEditor.py", line 721, in OutputFsp
    self.OutputText += str(self.BuildList[i].decode('utf-8')) + "\n"
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xa8 in position 6: invalid start byte

> -----Original Message-----
> From: Loo, Tung Lun
> Sent: Friday, April 30, 2021 3:47 PM
> To: devel@edk2.groups.io
> Cc: Loo, Tung Lun; Ma, Maurice; Desimone, Nathaniel L; Zeng, Star; Chiu, Chasel
> Subject: [PATCH v4] IntelFsp2Pkg: Add Config Editor tool support
>
> This is a GUI interface that can be used by users who
> would like to change configuration settings directly
> from the interface without having to modify the source.
>
> This tool depends on Python GUI tool kit Tkinter.
> It runs on both Windows and Linux.
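For reference, here is a minimal, illustrative sketch of the decode failure reported in the traceback above, together with one tolerant fallback. The name build_entry is a hypothetical stand-in for the self.BuildList[i] bytes; this is only an example, not the patch's implementation:

    # Hypothetical stand-in for self.BuildList[i]; 0xa8 mirrors the
    # invalid UTF-8 start byte from the traceback.
    build_entry = b'Build \xa8 info'

    # A strict decode raises UnicodeDecodeError, as OutputFsp shows.
    # Falling back to errors='replace' keeps the dialog usable.
    try:
        text = build_entry.decode('utf-8')
    except UnicodeDecodeError:
        text = build_entry.decode('utf-8', errors='replace')

    print(text)

With errors='replace' the undecodable byte is rendered as U+FFFD instead of raising, so "Show Binary Configuration" could still display the remaining FSP header fields.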
>=20 > The user needs to load the YAML file along with DLT file > for a specific board into the ConfigEditor, change the desired > configuration values. Finally, generate a new configuration delta > file or a config binary blob for the newly changed values to take > effect. These will be the inputs to the merge tool or the stitch > tool so that new config changes can be merged and stitched into > the final configuration blob. >=20 > This tool also supports binary update directly and display FSP > information. It is also backward compatible for BSF file format. >=20 > Running Configuration Editor: > python ConfigEditor.py >=20 > Co-authored-by: Maurice Ma > Cc: Maurice Ma > Cc: Nate DeSimone > Cc: Star Zeng > Cc: Chasel Chiu > Signed-off-by: Loo Tung Lun > --- > IntelFsp2Pkg/Tools/ConfigEditor/CommonUtility.py | 504 > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++ > IntelFsp2Pkg/Tools/ConfigEditor/ConfigEditor.py | 1467 > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++ > IntelFsp2Pkg/Tools/ConfigEditor/FspDscBsf2Yaml.py | 664 > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > ++++++++++++++++++++++++++++++++++++++++++++ > IntelFsp2Pkg/Tools/ConfigEditor/FspGenCfgData.py | 2598 > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > ++++++++++++++++++++++++ > IntelFsp2Pkg/Tools/ConfigEditor/GenYamlCfg.py | 2241 > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > 
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > +++++++++++++++++++++++++ > IntelFsp2Pkg/Tools/ConfigEditor/SingleSign.py | 324 > +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ > ++++++++++++++++++++++++++++++++++++++++++++++++++++ > IntelFsp2Pkg/Tools/UserManuals/ConfigEditorUserManual.md | 46 > +++++++++++++++++ > 7 files changed, 7844 insertions(+) >=20 > diff --git a/IntelFsp2Pkg/Tools/ConfigEditor/CommonUtility.py > b/IntelFsp2Pkg/Tools/ConfigEditor/CommonUtility.py > new file mode 100644 > index 0000000000..757e63150f > --- /dev/null > +++ b/IntelFsp2Pkg/Tools/ConfigEditor/CommonUtility.py > @@ -0,0 +1,504 @@ > +#!/usr/bin/env python >=20 > +# @ CommonUtility.py >=20 > +# Common utility script >=20 > +# >=20 > +# Copyright (c) 2016 - 2020, Intel Corporation. All rights reserved.
>=20 > +# SPDX-License-Identifier: BSD-2-Clause-Patent >=20 > +# >=20 > +## >=20 > + >=20 > +import os >=20 > +import sys >=20 > +import shutil >=20 > +import subprocess >=20 > +import string >=20 > +from ctypes import ARRAY, c_char, c_uint16, c_uint32, \ >=20 > + c_uint8, Structure, sizeof >=20 > +from importlib.machinery import SourceFileLoader >=20 > +from SingleSign import single_sign_gen_pub_key >=20 > + >=20 > + >=20 > +# Key types defined should match with cryptolib.h >=20 > +PUB_KEY_TYPE =3D { >=20 > + "RSA": 1, >=20 > + "ECC": 2, >=20 > + "DSA": 3, >=20 > + } >=20 > + >=20 > +# Signing type schemes defined should match with cryptolib.h >=20 > +SIGN_TYPE_SCHEME =3D { >=20 > + "RSA_PKCS1": 1, >=20 > + "RSA_PSS": 2, >=20 > + "ECC": 3, >=20 > + "DSA": 4, >=20 > + } >=20 > + >=20 > +# Hash values defined should match with cryptolib.h >=20 > +HASH_TYPE_VALUE =3D { >=20 > + "SHA2_256": 1, >=20 > + "SHA2_384": 2, >=20 > + "SHA2_512": 3, >=20 > + "SM3_256": 4, >=20 > + } >=20 > + >=20 > +# Hash values defined should match with cryptolib.h >=20 > +HASH_VAL_STRING =3D dict(map(reversed, HASH_TYPE_VALUE.items())) >=20 > + >=20 > +AUTH_TYPE_HASH_VALUE =3D { >=20 > + "SHA2_256": 1, >=20 > + "SHA2_384": 2, >=20 > + "SHA2_512": 3, >=20 > + "SM3_256": 4, >=20 > + "RSA2048SHA256": 1, >=20 > + "RSA3072SHA384": 2, >=20 > + } >=20 > + >=20 > +HASH_DIGEST_SIZE =3D { >=20 > + "SHA2_256": 32, >=20 > + "SHA2_384": 48, >=20 > + "SHA2_512": 64, >=20 > + "SM3_256": 32, >=20 > + } >=20 > + >=20 > + >=20 > +class PUB_KEY_HDR (Structure): >=20 > + _pack_ =3D 1 >=20 > + _fields_ =3D [ >=20 > + ('Identifier', ARRAY(c_char, 4)), # signature ('P', 'U', 'B= ', 'K') >=20 > + ('KeySize', c_uint16), # Length of Public Key >=20 > + ('KeyType', c_uint8), # RSA or ECC >=20 > + ('Reserved', ARRAY(c_uint8, 1)), >=20 > + ('KeyData', ARRAY(c_uint8, 0)), >=20 > + ] >=20 > + >=20 > + def __init__(self): >=20 > + self.Identifier =3D b'PUBK' >=20 > + >=20 > + >=20 > +class SIGNATURE_HDR (Structure): >=20 > + _pack_ =3D 1 >=20 > + _fields_ =3D [ >=20 > + ('Identifier', ARRAY(c_char, 4)), >=20 > + ('SigSize', c_uint16), >=20 > + ('SigType', c_uint8), >=20 > + ('HashAlg', c_uint8), >=20 > + ('Signature', ARRAY(c_uint8, 0)), >=20 > + ] >=20 > + >=20 > + def __init__(self): >=20 > + self.Identifier =3D b'SIGN' >=20 > + >=20 > + >=20 > +class LZ_HEADER(Structure): >=20 > + _pack_ =3D 1 >=20 > + _fields_ =3D [ >=20 > + ('signature', ARRAY(c_char, 4)), >=20 > + ('compressed_len', c_uint32), >=20 > + ('length', c_uint32), >=20 > + ('version', c_uint16), >=20 > + ('svn', c_uint8), >=20 > + ('attribute', c_uint8) >=20 > + ] >=20 > + _compress_alg =3D { >=20 > + b'LZDM': 'Dummy', >=20 > + b'LZ4 ': 'Lz4', >=20 > + b'LZMA': 'Lzma', >=20 > + } >=20 > + >=20 > + >=20 > +def print_bytes(data, indent=3D0, offset=3D0, show_ascii=3DFalse): >=20 > + bytes_per_line =3D 16 >=20 > + printable =3D ' ' + string.ascii_letters + string.digits + string.pu= nctuation >=20 > + str_fmt =3D '{:s}{:04x}: {:%ds} {:s}' % (bytes_per_line * 3) >=20 > + bytes_per_line >=20 > + data_array =3D bytearray(data) >=20 > + for idx in range(0, len(data_array), bytes_per_line): >=20 > + hex_str =3D ' '.join( >=20 > + '%02X' % val for val in data_array[idx:idx + bytes_per_line]= ) >=20 > + asc_str =3D ''.join('%c' % (val if (chr(val) in printable) else = '.') >=20 > + for val in data_array[idx:idx + bytes_per_line= ]) >=20 > + print(str_fmt.format( >=20 > + indent * ' ', >=20 > + offset + idx, hex_str, >=20 > + ' ' + asc_str if show_ascii else '')) >=20 > + >=20 > + >=20 > +def 
get_bits_from_bytes(bytes, start, length): >=20 > + if length =3D=3D 0: >=20 > + return 0 >=20 > + byte_start =3D (start) // 8 >=20 > + byte_end =3D (start + length - 1) // 8 >=20 > + bit_start =3D start & 7 >=20 > + mask =3D (1 << length) - 1 >=20 > + val =3D bytes_to_value(bytes[byte_start:byte_end + 1]) >=20 > + val =3D (val >> bit_start) & mask >=20 > + return val >=20 > + >=20 > + >=20 > +def set_bits_to_bytes(bytes, start, length, bvalue): >=20 > + if length =3D=3D 0: >=20 > + return >=20 > + byte_start =3D (start) // 8 >=20 > + byte_end =3D (start + length - 1) // 8 >=20 > + bit_start =3D start & 7 >=20 > + mask =3D (1 << length) - 1 >=20 > + val =3D bytes_to_value(bytes[byte_start:byte_end + 1]) >=20 > + val &=3D ~(mask << bit_start) >=20 > + val |=3D ((bvalue & mask) << bit_start) >=20 > + bytes[byte_start:byte_end+1] =3D value_to_bytearray( >=20 > + val, >=20 > + byte_end + 1 - byte_start) >=20 > + >=20 > + >=20 > +def value_to_bytes(value, length): >=20 > + return value.to_bytes(length, 'little') >=20 > + >=20 > + >=20 > +def bytes_to_value(bytes): >=20 > + return int.from_bytes(bytes, 'little') >=20 > + >=20 > + >=20 > +def value_to_bytearray(value, length): >=20 > + return bytearray(value_to_bytes(value, length)) >=20 > + >=20 > +# def value_to_bytearray (value, length): >=20 > + return bytearray(value_to_bytes(value, length)) >=20 > + >=20 > + >=20 > +def get_aligned_value(value, alignment=3D4): >=20 > + if alignment !=3D (1 << (alignment.bit_length() - 1)): >=20 > + raise Exception( >=20 > + 'Alignment (0x%x) should to be power of 2 !' % alignment) >=20 > + value =3D (value + (alignment - 1)) & ~(alignment - 1) >=20 > + return value >=20 > + >=20 > + >=20 > +def get_padding_length(data_len, alignment=3D4): >=20 > + new_data_len =3D get_aligned_value(data_len, alignment) >=20 > + return new_data_len - data_len >=20 > + >=20 > + >=20 > +def get_file_data(file, mode=3D'rb'): >=20 > + return open(file, mode).read() >=20 > + >=20 > + >=20 > +def gen_file_from_object(file, object): >=20 > + open(file, 'wb').write(object) >=20 > + >=20 > + >=20 > +def gen_file_with_size(file, size): >=20 > + open(file, 'wb').write(b'\xFF' * size) >=20 > + >=20 > + >=20 > +def check_files_exist(base_name_list, dir=3D'', ext=3D''): >=20 > + for each in base_name_list: >=20 > + if not os.path.exists(os.path.join(dir, each + ext)): >=20 > + return False >=20 > + return True >=20 > + >=20 > + >=20 > +def load_source(name, filepath): >=20 > + mod =3D SourceFileLoader(name, filepath).load_module() >=20 > + return mod >=20 > + >=20 > + >=20 > +def get_openssl_path(): >=20 > + if os.name =3D=3D 'nt': >=20 > + if 'OPENSSL_PATH' not in os.environ: >=20 > + openssl_dir =3D "C:\\Openssl\\bin\\" >=20 > + if os.path.exists(openssl_dir): >=20 > + os.environ['OPENSSL_PATH'] =3D openssl_dir >=20 > + else: >=20 > + os.environ['OPENSSL_PATH'] =3D "C:\\Openssl\\" >=20 > + if 'OPENSSL_CONF' not in os.environ: >=20 > + openssl_cfg =3D "C:\\Openssl\\openssl.cfg" >=20 > + if os.path.exists(openssl_cfg): >=20 > + os.environ['OPENSSL_CONF'] =3D openssl_cfg >=20 > + openssl =3D os.path.join( >=20 > + os.environ.get('OPENSSL_PATH', ''), >=20 > + 'openssl.exe') >=20 > + else: >=20 > + # Get openssl path for Linux cases >=20 > + openssl =3D shutil.which('openssl') >=20 > + >=20 > + return openssl >=20 > + >=20 > + >=20 > +def run_process(arg_list, print_cmd=3DFalse, capture_out=3DFalse): >=20 > + sys.stdout.flush() >=20 > + if os.name =3D=3D 'nt' and os.path.splitext(arg_list[0])[1] =3D=3D '= ' and \ >=20 > + 
os.path.exists(arg_list[0] + '.exe'): >=20 > + arg_list[0] +=3D '.exe' >=20 > + if print_cmd: >=20 > + print(' '.join(arg_list)) >=20 > + >=20 > + exc =3D None >=20 > + result =3D 0 >=20 > + output =3D '' >=20 > + try: >=20 > + if capture_out: >=20 > + output =3D subprocess.check_output(arg_list).decode() >=20 > + else: >=20 > + result =3D subprocess.call(arg_list) >=20 > + except Exception as ex: >=20 > + result =3D 1 >=20 > + exc =3D ex >=20 > + >=20 > + if result: >=20 > + if not print_cmd: >=20 > + print('Error in running process:\n %s' % ' '.join(arg_list)= ) >=20 > + if exc is None: >=20 > + sys.exit(1) >=20 > + else: >=20 > + raise exc >=20 > + >=20 > + return output >=20 > + >=20 > + >=20 > +# Adjust hash type algorithm based on Public key file >=20 > +def adjust_hash_type(pub_key_file): >=20 > + key_type =3D get_key_type(pub_key_file) >=20 > + if key_type =3D=3D 'RSA2048': >=20 > + hash_type =3D 'SHA2_256' >=20 > + elif key_type =3D=3D 'RSA3072': >=20 > + hash_type =3D 'SHA2_384' >=20 > + else: >=20 > + hash_type =3D None >=20 > + >=20 > + return hash_type >=20 > + >=20 > + >=20 > +def rsa_sign_file( >=20 > + priv_key, pub_key, hash_type, sign_scheme, >=20 > + in_file, out_file, inc_dat=3DFalse, inc_key=3DFalse): >=20 > + >=20 > + bins =3D bytearray() >=20 > + if inc_dat: >=20 > + bins.extend(get_file_data(in_file)) >=20 > + >=20 > + >=20 > +# def single_sign_file(priv_key, hash_type, sign_scheme, in_file, out_fi= le): >=20 > + >=20 > + out_data =3D get_file_data(out_file) >=20 > + >=20 > + sign =3D SIGNATURE_HDR() >=20 > + sign.SigSize =3D len(out_data) >=20 > + sign.SigType =3D SIGN_TYPE_SCHEME[sign_scheme] >=20 > + sign.HashAlg =3D HASH_TYPE_VALUE[hash_type] >=20 > + >=20 > + bins.extend(bytearray(sign) + out_data) >=20 > + if inc_key: >=20 > + key =3D gen_pub_key(priv_key, pub_key) >=20 > + bins.extend(key) >=20 > + >=20 > + if len(bins) !=3D len(out_data): >=20 > + gen_file_from_object(out_file, bins) >=20 > + >=20 > + >=20 > +def get_key_type(in_key): >=20 > + >=20 > + # Check in_key is file or key Id >=20 > + if not os.path.exists(in_key): >=20 > + key =3D bytearray(gen_pub_key(in_key)) >=20 > + else: >=20 > + # Check for public key in binary format. 
>=20 > + key =3D bytearray(get_file_data(in_key)) >=20 > + >=20 > + pub_key_hdr =3D PUB_KEY_HDR.from_buffer(key) >=20 > + if pub_key_hdr.Identifier !=3D b'PUBK': >=20 > + pub_key =3D gen_pub_key(in_key) >=20 > + pub_key_hdr =3D PUB_KEY_HDR.from_buffer(pub_key) >=20 > + >=20 > + key_type =3D next( >=20 > + (key for key, >=20 > + value in PUB_KEY_TYPE.items() if value =3D=3D pub_key_hdr.Ke= yType)) >=20 > + return '%s%d' % (key_type, (pub_key_hdr.KeySize - 4) * 8) >=20 > + >=20 > + >=20 > +def get_auth_hash_type(key_type, sign_scheme): >=20 > + if key_type =3D=3D "RSA2048" and sign_scheme =3D=3D "RSA_PKCS1": >=20 > + hash_type =3D 'SHA2_256' >=20 > + auth_type =3D 'RSA2048_PKCS1_SHA2_256' >=20 > + elif key_type =3D=3D "RSA3072" and sign_scheme =3D=3D "RSA_PKCS1": >=20 > + hash_type =3D 'SHA2_384' >=20 > + auth_type =3D 'RSA3072_PKCS1_SHA2_384' >=20 > + elif key_type =3D=3D "RSA2048" and sign_scheme =3D=3D "RSA_PSS": >=20 > + hash_type =3D 'SHA2_256' >=20 > + auth_type =3D 'RSA2048_PSS_SHA2_256' >=20 > + elif key_type =3D=3D "RSA3072" and sign_scheme =3D=3D "RSA_PSS": >=20 > + hash_type =3D 'SHA2_384' >=20 > + auth_type =3D 'RSA3072_PSS_SHA2_384' >=20 > + else: >=20 > + hash_type =3D '' >=20 > + auth_type =3D '' >=20 > + return auth_type, hash_type >=20 > + >=20 > + >=20 > +# def single_sign_gen_pub_key(in_key, pub_key_file=3DNone): >=20 > + >=20 > + >=20 > +def gen_pub_key(in_key, pub_key=3DNone): >=20 > + >=20 > + keydata =3D single_sign_gen_pub_key(in_key, pub_key) >=20 > + >=20 > + publickey =3D PUB_KEY_HDR() >=20 > + publickey.KeySize =3D len(keydata) >=20 > + publickey.KeyType =3D PUB_KEY_TYPE['RSA'] >=20 > + >=20 > + key =3D bytearray(publickey) + keydata >=20 > + >=20 > + if pub_key: >=20 > + gen_file_from_object(pub_key, key) >=20 > + >=20 > + return key >=20 > + >=20 > + >=20 > +def decompress(in_file, out_file, tool_dir=3D''): >=20 > + if not os.path.isfile(in_file): >=20 > + raise Exception("Invalid input file '%s' !" % in_file) >=20 > + >=20 > + # Remove the Lz Header >=20 > + fi =3D open(in_file, 'rb') >=20 > + di =3D bytearray(fi.read()) >=20 > + fi.close() >=20 > + >=20 > + lz_hdr =3D LZ_HEADER.from_buffer(di) >=20 > + offset =3D sizeof(lz_hdr) >=20 > + if lz_hdr.signature =3D=3D b"LZDM" or lz_hdr.compressed_len =3D=3D 0= : >=20 > + fo =3D open(out_file, 'wb') >=20 > + fo.write(di[offset:offset + lz_hdr.compressed_len]) >=20 > + fo.close() >=20 > + return >=20 > + >=20 > + temp =3D os.path.splitext(out_file)[0] + '.tmp' >=20 > + if lz_hdr.signature =3D=3D b"LZMA": >=20 > + alg =3D "Lzma" >=20 > + elif lz_hdr.signature =3D=3D b"LZ4 ": >=20 > + alg =3D "Lz4" >=20 > + else: >=20 > + raise Exception("Unsupported compression '%s' !" % lz_hdr.signat= ure) >=20 > + >=20 > + fo =3D open(temp, 'wb') >=20 > + fo.write(di[offset:offset + lz_hdr.compressed_len]) >=20 > + fo.close() >=20 > + >=20 > + compress_tool =3D "%sCompress" % alg >=20 > + if alg =3D=3D "Lz4": >=20 > + try: >=20 > + cmdline =3D [ >=20 > + os.path.join(tool_dir, compress_tool), >=20 > + "-d", >=20 > + "-o", out_file, >=20 > + temp] >=20 > + run_process(cmdline, False, True) >=20 > + except Exception: >=20 > + msg_string =3D "Could not find/use CompressLz4 tool, " \ >=20 > + "trying with python lz4..." >=20 > + print(msg_string) >=20 > + try: >=20 > + import lz4.block >=20 > + if lz4.VERSION !=3D '3.1.1': >=20 > + msg_string =3D "Recommended lz4 module version " \ >=20 > + "is '3.1.1'," + lz4.VERSION \ >=20 > + + " is currently installed." 
>=20 > + print(msg_string) >=20 > + except ImportError: >=20 > + msg_string =3D "Could not import lz4, use " \ >=20 > + "'python -m pip install lz4=3D=3D3.1.1' " \ >=20 > + "to install it." >=20 > + print(msg_string) >=20 > + exit(1) >=20 > + decompress_data =3D lz4.block.decompress(get_file_data(temp)= ) >=20 > + with open(out_file, "wb") as lz4bin: >=20 > + lz4bin.write(decompress_data) >=20 > + else: >=20 > + cmdline =3D [ >=20 > + os.path.join(tool_dir, compress_tool), >=20 > + "-d", >=20 > + "-o", out_file, >=20 > + temp] >=20 > + run_process(cmdline, False, True) >=20 > + os.remove(temp) >=20 > + >=20 > + >=20 > +def compress(in_file, alg, svn=3D0, out_path=3D'', tool_dir=3D''): >=20 > + if not os.path.isfile(in_file): >=20 > + raise Exception("Invalid input file '%s' !" % in_file) >=20 > + >=20 > + basename, ext =3D os.path.splitext(os.path.basename(in_file)) >=20 > + if out_path: >=20 > + if os.path.isdir(out_path): >=20 > + out_file =3D os.path.join(out_path, basename + '.lz') >=20 > + else: >=20 > + out_file =3D os.path.join(out_path) >=20 > + else: >=20 > + out_file =3D os.path.splitext(in_file)[0] + '.lz' >=20 > + >=20 > + if alg =3D=3D "Lzma": >=20 > + sig =3D "LZMA" >=20 > + elif alg =3D=3D "Tiano": >=20 > + sig =3D "LZUF" >=20 > + elif alg =3D=3D "Lz4": >=20 > + sig =3D "LZ4 " >=20 > + elif alg =3D=3D "Dummy": >=20 > + sig =3D "LZDM" >=20 > + else: >=20 > + raise Exception("Unsupported compression '%s' !" % alg) >=20 > + >=20 > + in_len =3D os.path.getsize(in_file) >=20 > + if in_len > 0: >=20 > + compress_tool =3D "%sCompress" % alg >=20 > + if sig =3D=3D "LZDM": >=20 > + shutil.copy(in_file, out_file) >=20 > + compress_data =3D get_file_data(out_file) >=20 > + elif sig =3D=3D "LZ4 ": >=20 > + try: >=20 > + cmdline =3D [ >=20 > + os.path.join(tool_dir, compress_tool), >=20 > + "-e", >=20 > + "-o", out_file, >=20 > + in_file] >=20 > + run_process(cmdline, False, True) >=20 > + compress_data =3D get_file_data(out_file) >=20 > + except Exception: >=20 > + msg_string =3D "Could not find/use CompressLz4 tool, " \ >=20 > + "trying with python lz4..." >=20 > + print(msg_string) >=20 > + try: >=20 > + import lz4.block >=20 > + if lz4.VERSION !=3D '3.1.1': >=20 > + msg_string =3D "Recommended lz4 module version "= \ >=20 > + "is '3.1.1', " + lz4.VERSION \ >=20 > + + " is currently installed." >=20 > + print(msg_string) >=20 > + except ImportError: >=20 > + msg_string =3D "Could not import lz4, use " \ >=20 > + "'python -m pip install lz4=3D=3D3.1.1' = " \ >=20 > + "to install it." 
>=20 > + print(msg_string) >=20 > + exit(1) >=20 > + compress_data =3D lz4.block.compress( >=20 > + get_file_data(in_file), >=20 > + mode=3D'high_compression') >=20 > + elif sig =3D=3D "LZMA": >=20 > + cmdline =3D [ >=20 > + os.path.join(tool_dir, compress_tool), >=20 > + "-e", >=20 > + "-o", out_file, >=20 > + in_file] >=20 > + run_process(cmdline, False, True) >=20 > + compress_data =3D get_file_data(out_file) >=20 > + else: >=20 > + compress_data =3D bytearray() >=20 > + >=20 > + lz_hdr =3D LZ_HEADER() >=20 > + lz_hdr.signature =3D sig.encode() >=20 > + lz_hdr.svn =3D svn >=20 > + lz_hdr.compressed_len =3D len(compress_data) >=20 > + lz_hdr.length =3D os.path.getsize(in_file) >=20 > + data =3D bytearray() >=20 > + data.extend(lz_hdr) >=20 > + data.extend(compress_data) >=20 > + gen_file_from_object(out_file, data) >=20 > + >=20 > + return out_file >=20 > diff --git a/IntelFsp2Pkg/Tools/ConfigEditor/ConfigEditor.py > b/IntelFsp2Pkg/Tools/ConfigEditor/ConfigEditor.py > new file mode 100644 > index 0000000000..0a944d2be0 > --- /dev/null > +++ b/IntelFsp2Pkg/Tools/ConfigEditor/ConfigEditor.py > @@ -0,0 +1,1467 @@ > +# @ ConfigEditor.py >=20 > +# >=20 > +# Copyright(c) 2018 - 2021, Intel Corporation. All rights reserved.
>=20 > +# SPDX-License-Identifier: BSD-2-Clause-Patent >=20 > +# >=20 > +## >=20 > + >=20 > +import os >=20 > +import sys >=20 > +import marshal >=20 > +import tkinter >=20 > +import tkinter.ttk as ttk >=20 > +import tkinter.messagebox as messagebox >=20 > +import tkinter.filedialog as filedialog >=20 > + >=20 > +from pathlib import Path >=20 > +from GenYamlCfg import CGenYamlCfg, bytes_to_value, \ >=20 > + bytes_to_bracket_str, value_to_bytes, array_str_to_value >=20 > +from ctypes import sizeof, Structure, ARRAY, c_uint8, c_uint64, c_char, = \ >=20 > + c_uint32, c_uint16 >=20 > +from functools import reduce >=20 > +from FspDscBsf2Yaml import bsf_to_dsc, dsc_to_yaml >=20 > + >=20 > + >=20 > +sys.dont_write_bytecode =3D True >=20 > + >=20 > + >=20 > +class create_tool_tip(object): >=20 > + ''' >=20 > + create a tooltip for a given widget >=20 > + ''' >=20 > + in_progress =3D False >=20 > + >=20 > + def __init__(self, widget, text=3D''): >=20 > + self.top_win =3D None >=20 > + self.widget =3D widget >=20 > + self.text =3D text >=20 > + self.widget.bind("", self.enter) >=20 > + self.widget.bind("", self.leave) >=20 > + >=20 > + def enter(self, event=3DNone): >=20 > + if self.in_progress: >=20 > + return >=20 > + if self.widget.winfo_class() =3D=3D 'Treeview': >=20 > + # Only show help when cursor is on row header. >=20 > + rowid =3D self.widget.identify_row(event.y) >=20 > + if rowid !=3D '': >=20 > + return >=20 > + else: >=20 > + x, y, cx, cy =3D self.widget.bbox("insert") >=20 > + >=20 > + cursor =3D self.widget.winfo_pointerxy() >=20 > + x =3D self.widget.winfo_rootx() + 35 >=20 > + y =3D self.widget.winfo_rooty() + 20 >=20 > + if cursor[1] > y and cursor[1] < y + 20: >=20 > + y +=3D 20 >=20 > + >=20 > + # creates a toplevel window >=20 > + self.top_win =3D tkinter.Toplevel(self.widget) >=20 > + # Leaves only the label and removes the app window >=20 > + self.top_win.wm_overrideredirect(True) >=20 > + self.top_win.wm_geometry("+%d+%d" % (x, y)) >=20 > + label =3D tkinter.Message(self.top_win, >=20 > + text=3Dself.text, >=20 > + justify=3D'left', >=20 > + background=3D'bisque', >=20 > + relief=3D'solid', >=20 > + borderwidth=3D1, >=20 > + font=3D("times", "10", "normal")) >=20 > + label.pack(ipadx=3D1) >=20 > + self.in_progress =3D True >=20 > + >=20 > + def leave(self, event=3DNone): >=20 > + if self.top_win: >=20 > + self.top_win.destroy() >=20 > + self.in_progress =3D False >=20 > + >=20 > + >=20 > +class validating_entry(tkinter.Entry): >=20 > + def __init__(self, master, **kw): >=20 > + tkinter.Entry.__init__(*(self, master), **kw) >=20 > + self.parent =3D master >=20 > + self.old_value =3D '' >=20 > + self.last_value =3D '' >=20 > + self.variable =3D tkinter.StringVar() >=20 > + self.variable.trace("w", self.callback) >=20 > + self.config(textvariable=3Dself.variable) >=20 > + self.config({"background": "#c0c0c0"}) >=20 > + self.bind("", self.move_next) >=20 > + self.bind("", self.move_next) >=20 > + self.bind("", self.cancel) >=20 > + for each in ['BackSpace', 'Delete']: >=20 > + self.bind("<%s>" % each, self.ignore) >=20 > + self.display(None) >=20 > + >=20 > + def ignore(self, even): >=20 > + return "break" >=20 > + >=20 > + def move_next(self, event): >=20 > + if self.row < 0: >=20 > + return >=20 > + row, col =3D self.row, self.col >=20 > + txt, row_id, col_id =3D self.parent.get_next_cell(row, col) >=20 > + self.display(txt, row_id, col_id) >=20 > + return "break" >=20 > + >=20 > + def cancel(self, event): >=20 > + self.variable.set(self.old_value) >=20 > + self.display(None) 
>=20 > + >=20 > + def display(self, txt, row_id=3D'', col_id=3D''): >=20 > + if txt is None: >=20 > + self.row =3D -1 >=20 > + self.col =3D -1 >=20 > + self.place_forget() >=20 > + else: >=20 > + row =3D int('0x' + row_id[1:], 0) - 1 >=20 > + col =3D int(col_id[1:]) - 1 >=20 > + self.row =3D row >=20 > + self.col =3D col >=20 > + self.old_value =3D txt >=20 > + self.last_value =3D txt >=20 > + x, y, width, height =3D self.parent.bbox(row_id, col) >=20 > + self.place(x=3Dx, y=3Dy, w=3Dwidth) >=20 > + self.variable.set(txt) >=20 > + self.focus_set() >=20 > + self.icursor(0) >=20 > + >=20 > + def callback(self, *Args): >=20 > + cur_val =3D self.variable.get() >=20 > + new_val =3D self.validate(cur_val) >=20 > + if new_val is not None and self.row >=3D 0: >=20 > + self.last_value =3D new_val >=20 > + self.parent.set_cell(self.row, self.col, new_val) >=20 > + self.variable.set(self.last_value) >=20 > + >=20 > + def validate(self, value): >=20 > + if len(value) > 0: >=20 > + try: >=20 > + int(value, 16) >=20 > + except Exception: >=20 > + return None >=20 > + >=20 > + # Normalize the cell format >=20 > + self.update() >=20 > + cell_width =3D self.winfo_width() >=20 > + max_len =3D custom_table.to_byte_length(cell_width) * 2 >=20 > + cur_pos =3D self.index("insert") >=20 > + if cur_pos =3D=3D max_len + 1: >=20 > + value =3D value[-max_len:] >=20 > + else: >=20 > + value =3D value[:max_len] >=20 > + if value =3D=3D '': >=20 > + value =3D '0' >=20 > + fmt =3D '%%0%dX' % max_len >=20 > + return fmt % int(value, 16) >=20 > + >=20 > + >=20 > +class custom_table(ttk.Treeview): >=20 > + _Padding =3D 20 >=20 > + _Char_width =3D 6 >=20 > + >=20 > + def __init__(self, parent, col_hdr, bins): >=20 > + cols =3D len(col_hdr) >=20 > + >=20 > + col_byte_len =3D [] >=20 > + for col in range(cols): # Columns >=20 > + col_byte_len.append(int(col_hdr[col].split(':')[1])) >=20 > + >=20 > + byte_len =3D sum(col_byte_len) >=20 > + rows =3D (len(bins) + byte_len - 1) // byte_len >=20 > + >=20 > + self.rows =3D rows >=20 > + self.cols =3D cols >=20 > + self.col_byte_len =3D col_byte_len >=20 > + self.col_hdr =3D col_hdr >=20 > + >=20 > + self.size =3D len(bins) >=20 > + self.last_dir =3D '' >=20 > + >=20 > + style =3D ttk.Style() >=20 > + style.configure("Custom.Treeview.Heading", >=20 > + font=3D('calibri', 10, 'bold'), >=20 > + foreground=3D"blue") >=20 > + ttk.Treeview.__init__(self, parent, height=3Drows, >=20 > + columns=3D[''] + col_hdr, show=3D'headings= ', >=20 > + style=3D"Custom.Treeview", >=20 > + selectmode=3D'none') >=20 > + self.bind("", self.click) >=20 > + self.bind("", self.focus_out) >=20 > + self.entry =3D validating_entry(self, width=3D4, justify=3Dtkin= ter.CENTER) >=20 > + >=20 > + self.heading(0, text=3D'LOAD') >=20 > + self.column(0, width=3D60, stretch=3D0, anchor=3Dtkinter.CENTER) >=20 > + >=20 > + for col in range(cols): # Columns >=20 > + text =3D col_hdr[col].split(':')[0] >=20 > + byte_len =3D int(col_hdr[col].split(':')[1]) >=20 > + self.heading(col+1, text=3Dtext) >=20 > + self.column(col+1, width=3Dself.to_cell_width(byte_len), >=20 > + stretch=3D0, anchor=3Dtkinter.CENTER) >=20 > + idx =3D 0 >=20 > + for row in range(rows): # Rows >=20 > + text =3D '%04X' % (row * len(col_hdr)) >=20 > + vals =3D ['%04X:' % (cols * row)] >=20 > + for col in range(cols): # Columns >=20 > + if idx >=3D len(bins): >=20 > + break >=20 > + byte_len =3D int(col_hdr[col].split(':')[1]) >=20 > + value =3D bytes_to_value(bins[idx:idx+byte_len]) >=20 > + hex =3D ("%%0%dX" % (byte_len * 2)) % value >=20 > + 
vals.append(hex) >=20 > + idx +=3D byte_len >=20 > + self.insert('', 'end', values=3Dtuple(vals)) >=20 > + if idx >=3D len(bins): >=20 > + break >=20 > + >=20 > + @staticmethod >=20 > + def to_cell_width(byte_len): >=20 > + return byte_len * 2 * custom_table._Char_width + custom_table._P= adding >=20 > + >=20 > + @staticmethod >=20 > + def to_byte_length(cell_width): >=20 > + return(cell_width - custom_table._Padding) \ >=20 > + // (2 * custom_table._Char_width) >=20 > + >=20 > + def focus_out(self, event): >=20 > + self.entry.display(None) >=20 > + >=20 > + def refresh_bin(self, bins): >=20 > + if not bins: >=20 > + return >=20 > + >=20 > + # Reload binary into widget >=20 > + bin_len =3D len(bins) >=20 > + for row in range(self.rows): >=20 > + iid =3D self.get_children()[row] >=20 > + for col in range(self.cols): >=20 > + idx =3D row * sum(self.col_byte_len) + \ >=20 > + sum(self.col_byte_len[:col]) >=20 > + byte_len =3D self.col_byte_len[col] >=20 > + if idx + byte_len <=3D self.size: >=20 > + byte_len =3D int(self.col_hdr[col].split(':')[1]) >=20 > + if idx + byte_len > bin_len: >=20 > + val =3D 0 >=20 > + else: >=20 > + val =3D bytes_to_value(bins[idx:idx+byte_len]) >=20 > + hex_val =3D ("%%0%dX" % (byte_len * 2)) % val >=20 > + self.set(iid, col + 1, hex_val) >=20 > + >=20 > + def get_cell(self, row, col): >=20 > + iid =3D self.get_children()[row] >=20 > + txt =3D self.item(iid, 'values')[col] >=20 > + return txt >=20 > + >=20 > + def get_next_cell(self, row, col): >=20 > + rows =3D self.get_children() >=20 > + col +=3D 1 >=20 > + if col > self.cols: >=20 > + col =3D 1 >=20 > + row +=3D 1 >=20 > + cnt =3D row * sum(self.col_byte_len) + sum(self.col_byte_len[:co= l]) >=20 > + if cnt > self.size: >=20 > + # Reached the last cell, so roll back to beginning >=20 > + row =3D 0 >=20 > + col =3D 1 >=20 > + >=20 > + txt =3D self.get_cell(row, col) >=20 > + row_id =3D rows[row] >=20 > + col_id =3D '#%d' % (col + 1) >=20 > + return(txt, row_id, col_id) >=20 > + >=20 > + def set_cell(self, row, col, val): >=20 > + iid =3D self.get_children()[row] >=20 > + self.set(iid, col, val) >=20 > + >=20 > + def load_bin(self): >=20 > + # Load binary from file >=20 > + path =3D filedialog.askopenfilename( >=20 > + initialdir=3Dself.last_dir, >=20 > + title=3D"Load binary file", >=20 > + filetypes=3D(("Binary files", "*.bin"), ( >=20 > + "binary files", "*.bin"))) >=20 > + if path: >=20 > + self.last_dir =3D os.path.dirname(path) >=20 > + fd =3D open(path, 'rb') >=20 > + bins =3D bytearray(fd.read())[:self.size] >=20 > + fd.close() >=20 > + bins.extend(b'\x00' * (self.size - len(bins))) >=20 > + return bins >=20 > + >=20 > + return None >=20 > + >=20 > + def click(self, event): >=20 > + row_id =3D self.identify_row(event.y) >=20 > + col_id =3D self.identify_column(event.x) >=20 > + if row_id =3D=3D '' and col_id =3D=3D '#1': >=20 > + # Clicked on "LOAD" cell >=20 > + bins =3D self.load_bin() >=20 > + self.refresh_bin(bins) >=20 > + return >=20 > + >=20 > + if col_id =3D=3D '#1': >=20 > + # Clicked on column 1(Offset column) >=20 > + return >=20 > + >=20 > + item =3D self.identify('item', event.x, event.y) >=20 > + if not item or not col_id: >=20 > + # Not clicked on valid cell >=20 > + return >=20 > + >=20 > + # Clicked cell >=20 > + row =3D int('0x' + row_id[1:], 0) - 1 >=20 > + col =3D int(col_id[1:]) - 1 >=20 > + if row * self.cols + col > self.size: >=20 > + return >=20 > + >=20 > + vals =3D self.item(item, 'values') >=20 > + if col < len(vals): >=20 > + txt =3D self.item(item, 'values')[col] >=20 > + 
self.entry.display(txt, row_id, col_id) >=20 > + >=20 > + def get(self): >=20 > + bins =3D bytearray() >=20 > + row_ids =3D self.get_children() >=20 > + for row_id in row_ids: >=20 > + row =3D int('0x' + row_id[1:], 0) - 1 >=20 > + for col in range(self.cols): >=20 > + idx =3D row * sum(self.col_byte_len) + \ >=20 > + sum(self.col_byte_len[:col]) >=20 > + byte_len =3D self.col_byte_len[col] >=20 > + if idx + byte_len > self.size: >=20 > + break >=20 > + hex =3D self.item(row_id, 'values')[col + 1] >=20 > + values =3D value_to_bytes(int(hex, 16) >=20 > + & ((1 << byte_len * 8) - 1), byt= e_len) >=20 > + bins.extend(values) >=20 > + return bins >=20 > + >=20 > + >=20 > +class c_uint24(Structure): >=20 > + """Little-Endian 24-bit Unsigned Integer""" >=20 > + _pack_ =3D 1 >=20 > + _fields_ =3D [('Data', (c_uint8 * 3))] >=20 > + >=20 > + def __init__(self, val=3D0): >=20 > + self.set_value(val) >=20 > + >=20 > + def __str__(self, indent=3D0): >=20 > + return '0x%.6x' % self.value >=20 > + >=20 > + def __int__(self): >=20 > + return self.get_value() >=20 > + >=20 > + def set_value(self, val): >=20 > + self.Data[0:3] =3D Val2Bytes(val, 3) >=20 > + >=20 > + def get_value(self): >=20 > + return Bytes2Val(self.Data[0:3]) >=20 > + >=20 > + value =3D property(get_value, set_value) >=20 > + >=20 > + >=20 > +class EFI_FIRMWARE_VOLUME_HEADER(Structure): >=20 > + _fields_ =3D [ >=20 > + ('ZeroVector', ARRAY(c_uint8, 16)), >=20 > + ('FileSystemGuid', ARRAY(c_uint8, 16)), >=20 > + ('FvLength', c_uint64), >=20 > + ('Signature', ARRAY(c_char, 4)), >=20 > + ('Attributes', c_uint32), >=20 > + ('HeaderLength', c_uint16), >=20 > + ('Checksum', c_uint16), >=20 > + ('ExtHeaderOffset', c_uint16), >=20 > + ('Reserved', c_uint8), >=20 > + ('Revision', c_uint8) >=20 > + ] >=20 > + >=20 > + >=20 > +class EFI_FIRMWARE_VOLUME_EXT_HEADER(Structure): >=20 > + _fields_ =3D [ >=20 > + ('FvName', ARRAY(c_uint8, 16)), >=20 > + ('ExtHeaderSize', c_uint32) >=20 > + ] >=20 > + >=20 > + >=20 > +class EFI_FFS_INTEGRITY_CHECK(Structure): >=20 > + _fields_ =3D [ >=20 > + ('Header', c_uint8), >=20 > + ('File', c_uint8) >=20 > + ] >=20 > + >=20 > + >=20 > +class EFI_FFS_FILE_HEADER(Structure): >=20 > + _fields_ =3D [ >=20 > + ('Name', ARRAY(c_uint8, 16)), >=20 > + ('IntegrityCheck', EFI_FFS_INTEGRITY_CHECK), >=20 > + ('Type', c_uint8), >=20 > + ('Attributes', c_uint8), >=20 > + ('Size', c_uint24), >=20 > + ('State', c_uint8) >=20 > + ] >=20 > + >=20 > + >=20 > +class EFI_COMMON_SECTION_HEADER(Structure): >=20 > + _fields_ =3D [ >=20 > + ('Size', c_uint24), >=20 > + ('Type', c_uint8) >=20 > + ] >=20 > + >=20 > + >=20 > +class EFI_SECTION_TYPE: >=20 > + """Enumeration of all valid firmware file section types.""" >=20 > + ALL =3D 0x00 >=20 > + COMPRESSION =3D 0x01 >=20 > + GUID_DEFINED =3D 0x02 >=20 > + DISPOSABLE =3D 0x03 >=20 > + PE32 =3D 0x10 >=20 > + PIC =3D 0x11 >=20 > + TE =3D 0x12 >=20 > + DXE_DEPEX =3D 0x13 >=20 > + VERSION =3D 0x14 >=20 > + USER_INTERFACE =3D 0x15 >=20 > + COMPATIBILITY16 =3D 0x16 >=20 > + FIRMWARE_VOLUME_IMAGE =3D 0x17 >=20 > + FREEFORM_SUBTYPE_GUID =3D 0x18 >=20 > + RAW =3D 0x19 >=20 > + PEI_DEPEX =3D 0x1b >=20 > + SMM_DEPEX =3D 0x1c >=20 > + >=20 > + >=20 > +class FSP_COMMON_HEADER(Structure): >=20 > + _fields_ =3D [ >=20 > + ('Signature', ARRAY(c_char, 4)), >=20 > + ('HeaderLength', c_uint32) >=20 > + ] >=20 > + >=20 > + >=20 > +class FSP_INFORMATION_HEADER(Structure): >=20 > + _fields_ =3D [ >=20 > + ('Signature', ARRAY(c_char, 4)), >=20 > + ('HeaderLength', c_uint32), >=20 > + ('Reserved1', c_uint16), >=20 > 
+ ('SpecVersion', c_uint8), >=20 > + ('HeaderRevision', c_uint8), >=20 > + ('ImageRevision', c_uint32), >=20 > + ('ImageId', ARRAY(c_char, 8)), >=20 > + ('ImageSize', c_uint32), >=20 > + ('ImageBase', c_uint32), >=20 > + ('ImageAttribute', c_uint16), >=20 > + ('ComponentAttribute', c_uint16), >=20 > + ('CfgRegionOffset', c_uint32), >=20 > + ('CfgRegionSize', c_uint32), >=20 > + ('Reserved2', c_uint32), >=20 > + ('TempRamInitEntryOffset', c_uint32), >=20 > + ('Reserved3', c_uint32), >=20 > + ('NotifyPhaseEntryOffset', c_uint32), >=20 > + ('FspMemoryInitEntryOffset', c_uint32), >=20 > + ('TempRamExitEntryOffset', c_uint32), >=20 > + ('FspSiliconInitEntryOffset', c_uint32) >=20 > + ] >=20 > + >=20 > + >=20 > +class FSP_EXTENDED_HEADER(Structure): >=20 > + _fields_ =3D [ >=20 > + ('Signature', ARRAY(c_char, 4)), >=20 > + ('HeaderLength', c_uint32), >=20 > + ('Revision', c_uint8), >=20 > + ('Reserved', c_uint8), >=20 > + ('FspProducerId', ARRAY(c_char, 6)), >=20 > + ('FspProducerRevision', c_uint32), >=20 > + ('FspProducerDataSize', c_uint32) >=20 > + ] >=20 > + >=20 > + >=20 > +class FSP_PATCH_TABLE(Structure): >=20 > + _fields_ =3D [ >=20 > + ('Signature', ARRAY(c_char, 4)), >=20 > + ('HeaderLength', c_uint16), >=20 > + ('HeaderRevision', c_uint8), >=20 > + ('Reserved', c_uint8), >=20 > + ('PatchEntryNum', c_uint32) >=20 > + ] >=20 > + >=20 > + >=20 > +class Section: >=20 > + def __init__(self, offset, secdata): >=20 > + self.SecHdr =3D EFI_COMMON_SECTION_HEADER.from_buffer(secdata, 0= ) >=20 > + self.SecData =3D secdata[0:int(self.SecHdr.Size)] >=20 > + self.Offset =3D offset >=20 > + >=20 > + >=20 > +def AlignPtr(offset, alignment=3D8): >=20 > + return (offset + alignment - 1) & ~(alignment - 1) >=20 > + >=20 > + >=20 > +def Bytes2Val(bytes): >=20 > + return reduce(lambda x, y: (x << 8) | y, bytes[:: -1]) >=20 > + >=20 > + >=20 > +def Val2Bytes(value, blen): >=20 > + return [(value >> (i*8) & 0xff) for i in range(blen)] >=20 > + >=20 > + >=20 > +class FirmwareFile: >=20 > + def __init__(self, offset, filedata): >=20 > + self.FfsHdr =3D EFI_FFS_FILE_HEADER.from_buffer(filedata, 0) >=20 > + self.FfsData =3D filedata[0:int(self.FfsHdr.Size)] >=20 > + self.Offset =3D offset >=20 > + self.SecList =3D [] >=20 > + >=20 > + def ParseFfs(self): >=20 > + ffssize =3D len(self.FfsData) >=20 > + offset =3D sizeof(self.FfsHdr) >=20 > + if self.FfsHdr.Name !=3D '\xff' * 16: >=20 > + while offset < (ffssize - sizeof(EFI_COMMON_SECTION_HEADER))= : >=20 > + sechdr =3D EFI_COMMON_SECTION_HEADER.from_buffer( >=20 > + self.FfsData, offset) >=20 > + sec =3D Section( >=20 > + offset, self.FfsData[offset:offset + int(sechdr.Size= )]) >=20 > + self.SecList.append(sec) >=20 > + offset +=3D int(sechdr.Size) >=20 > + offset =3D AlignPtr(offset, 4) >=20 > + >=20 > + >=20 > +class FirmwareVolume: >=20 > + def __init__(self, offset, fvdata): >=20 > + self.FvHdr =3D EFI_FIRMWARE_VOLUME_HEADER.from_buffer(fvdata, 0) >=20 > + self.FvData =3D fvdata[0: self.FvHdr.FvLength] >=20 > + self.Offset =3D offset >=20 > + if self.FvHdr.ExtHeaderOffset > 0: >=20 > + self.FvExtHdr =3D EFI_FIRMWARE_VOLUME_EXT_HEADER.from_buffer= ( >=20 > + self.FvData, self.FvHdr.ExtHeaderOffset) >=20 > + else: >=20 > + self.FvExtHdr =3D None >=20 > + self.FfsList =3D [] >=20 > + >=20 > + def ParseFv(self): >=20 > + fvsize =3D len(self.FvData) >=20 > + if self.FvExtHdr: >=20 > + offset =3D self.FvHdr.ExtHeaderOffset + self.FvExtHdr.ExtHea= derSize >=20 > + else: >=20 > + offset =3D self.FvHdr.HeaderLength >=20 > + offset =3D AlignPtr(offset) >=20 > + 
while offset < (fvsize - sizeof(EFI_FFS_FILE_HEADER)): >=20 > + ffshdr =3D EFI_FFS_FILE_HEADER.from_buffer(self.FvData, offs= et) >=20 > + if (ffshdr.Name =3D=3D '\xff' * 16) and \ >=20 > + (int(ffshdr.Size) =3D=3D 0xFFFFFF): >=20 > + offset =3D fvsize >=20 > + else: >=20 > + ffs =3D FirmwareFile( >=20 > + offset, self.FvData[offset:offset + int(ffshdr.Size)= ]) >=20 > + ffs.ParseFfs() >=20 > + self.FfsList.append(ffs) >=20 > + offset +=3D int(ffshdr.Size) >=20 > + offset =3D AlignPtr(offset) >=20 > + >=20 > + >=20 > +class FspImage: >=20 > + def __init__(self, offset, fih, fihoff, patch): >=20 > + self.Fih =3D fih >=20 > + self.FihOffset =3D fihoff >=20 > + self.Offset =3D offset >=20 > + self.FvIdxList =3D [] >=20 > + self.Type =3D "XTMSXXXXOXXXXXXX"[(fih.ComponentAttribute >> 12) = & 0x0F] >=20 > + self.PatchList =3D patch >=20 > + self.PatchList.append(fihoff + 0x1C) >=20 > + >=20 > + def AppendFv(self, FvIdx): >=20 > + self.FvIdxList.append(FvIdx) >=20 > + >=20 > + def Patch(self, delta, fdbin): >=20 > + count =3D 0 >=20 > + applied =3D 0 >=20 > + for idx, patch in enumerate(self.PatchList): >=20 > + ptype =3D (patch >> 24) & 0x0F >=20 > + if ptype not in [0x00, 0x0F]: >=20 > + raise Exception('ERROR: Invalid patch type %d !' % ptype= ) >=20 > + if patch & 0x80000000: >=20 > + patch =3D self.Fih.ImageSize - (0x1000000 - (patch & 0xF= FFFFF)) >=20 > + else: >=20 > + patch =3D patch & 0xFFFFFF >=20 > + if (patch < self.Fih.ImageSize) and \ >=20 > + (patch + sizeof(c_uint32) <=3D self.Fih.ImageSize): >=20 > + offset =3D patch + self.Offset >=20 > + value =3D Bytes2Val(fdbin[offset:offset+sizeof(c_uint32)= ]) >=20 > + value +=3D delta >=20 > + fdbin[offset:offset+sizeof(c_uint32)] =3D Val2Bytes( >=20 > + value, sizeof(c_uint32)) >=20 > + applied +=3D 1 >=20 > + count +=3D 1 >=20 > + # Don't count the FSP base address patch entry appended at the e= nd >=20 > + if count !=3D 0: >=20 > + count -=3D 1 >=20 > + applied -=3D 1 >=20 > + return (count, applied) >=20 > + >=20 > + >=20 > +class FirmwareDevice: >=20 > + def __init__(self, offset, FdData): >=20 > + self.FvList =3D [] >=20 > + self.FspList =3D [] >=20 > + self.FspExtList =3D [] >=20 > + self.FihList =3D [] >=20 > + self.BuildList =3D [] >=20 > + self.OutputText =3D "" >=20 > + self.Offset =3D 0 >=20 > + self.FdData =3D FdData >=20 > + >=20 > + def ParseFd(self): >=20 > + offset =3D 0 >=20 > + fdsize =3D len(self.FdData) >=20 > + self.FvList =3D [] >=20 > + while offset < (fdsize - sizeof(EFI_FIRMWARE_VOLUME_HEADER)): >=20 > + fvh =3D EFI_FIRMWARE_VOLUME_HEADER.from_buffer(self.FdData, > offset) >=20 > + if b'_FVH' !=3D fvh.Signature: >=20 > + raise Exception("ERROR: Invalid FV header !") >=20 > + fv =3D FirmwareVolume( >=20 > + offset, self.FdData[offset:offset + fvh.FvLength]) >=20 > + fv.ParseFv() >=20 > + self.FvList.append(fv) >=20 > + offset +=3D fv.FvHdr.FvLength >=20 > + >=20 > + def CheckFsp(self): >=20 > + if len(self.FspList) =3D=3D 0: >=20 > + return >=20 > + >=20 > + fih =3D None >=20 > + for fsp in self.FspList: >=20 > + if not fih: >=20 > + fih =3D fsp.Fih >=20 > + else: >=20 > + newfih =3D fsp.Fih >=20 > + if (newfih.ImageId !=3D fih.ImageId) or \ >=20 > + (newfih.ImageRevision !=3D fih.ImageRevision): >=20 > + raise Exception( >=20 > + "ERROR: Inconsistent FSP ImageId or " >=20 > + "ImageRevision detected !") >=20 > + >=20 > + def ParseFsp(self): >=20 > + flen =3D 0 >=20 > + for idx, fv in enumerate(self.FvList): >=20 > + # Check if this FV contains FSP header >=20 > + if flen =3D=3D 0: >=20 > + if len(fv.FfsList) 
=3D=3D 0: >=20 > + continue >=20 > + ffs =3D fv.FfsList[0] >=20 > + if len(ffs.SecList) =3D=3D 0: >=20 > + continue >=20 > + sec =3D ffs.SecList[0] >=20 > + if sec.SecHdr.Type !=3D EFI_SECTION_TYPE.RAW: >=20 > + continue >=20 > + fihoffset =3D ffs.Offset + sec.Offset + sizeof(sec.SecHd= r) >=20 > + fspoffset =3D fv.Offset >=20 > + offset =3D fspoffset + fihoffset >=20 > + fih =3D FSP_INFORMATION_HEADER.from_buffer(self.FdData, = offset) >=20 > + self.FihList.append(fih) >=20 > + if b'FSPH' !=3D fih.Signature: >=20 > + continue >=20 > + >=20 > + offset +=3D fih.HeaderLength >=20 > + >=20 > + offset =3D AlignPtr(offset, 2) >=20 > + Extfih =3D FSP_EXTENDED_HEADER.from_buffer(self.FdData, = offset) >=20 > + self.FspExtList.append(Extfih) >=20 > + offset =3D AlignPtr(offset, 4) >=20 > + plist =3D [] >=20 > + while True: >=20 > + fch =3D FSP_COMMON_HEADER.from_buffer(self.FdData, o= ffset) >=20 > + if b'FSPP' !=3D fch.Signature: >=20 > + offset +=3D fch.HeaderLength >=20 > + offset =3D AlignPtr(offset, 4) >=20 > + else: >=20 > + fspp =3D FSP_PATCH_TABLE.from_buffer( >=20 > + self.FdData, offset) >=20 > + offset +=3D sizeof(fspp) >=20 > + start_offset =3D offset + 32 >=20 > + end_offset =3D offset + 32 >=20 > + while True: >=20 > + end_offset +=3D 1 >=20 > + if(self.FdData[ >=20 > + end_offset: end_offset + 1] =3D=3D b= '\xff'): >=20 > + break >=20 > + self.BuildList.append( >=20 > + self.FdData[start_offset:end_offset]) >=20 > + pdata =3D (c_uint32 * fspp.PatchEntryNum).from_b= uffer( >=20 > + self.FdData, offset) >=20 > + plist =3D list(pdata) >=20 > + break >=20 > + >=20 > + fsp =3D FspImage(fspoffset, fih, fihoffset, plist) >=20 > + fsp.AppendFv(idx) >=20 > + self.FspList.append(fsp) >=20 > + flen =3D fsp.Fih.ImageSize - fv.FvHdr.FvLength >=20 > + else: >=20 > + fsp.AppendFv(idx) >=20 > + flen -=3D fv.FvHdr.FvLength >=20 > + if flen < 0: >=20 > + raise Exception("ERROR: Incorrect FV size in image != ") >=20 > + self.CheckFsp() >=20 > + >=20 > + def OutputFsp(self): >=20 > + def copy_text_to_clipboard(): >=20 > + window.clipboard_clear() >=20 > + window.clipboard_append(self.OutputText) >=20 > + >=20 > + window =3D tkinter.Tk() >=20 > + window.title("Fsp Headers") >=20 > + window.resizable(0, 0) >=20 > + # Window Size >=20 > + window.geometry("300x400+350+150") >=20 > + frame =3D tkinter.Frame(window) >=20 > + frame.pack(side=3Dtkinter.BOTTOM) >=20 > + # Vertical (y) Scroll Bar >=20 > + scroll =3D tkinter.Scrollbar(window) >=20 > + scroll.pack(side=3Dtkinter.RIGHT, fill=3Dtkinter.Y) >=20 > + text =3D tkinter.Text(window, >=20 > + wrap=3Dtkinter.NONE, yscrollcommand=3Dscroll= .set) >=20 > + i =3D 0 >=20 > + self.OutputText =3D self.OutputText + "Fsp Header Details \n\n" >=20 > + while i < len(self.FihList): >=20 > + self.OutputText +=3D str(self.BuildList[i].decode('utf-8')) = + "\n" >=20 > + self.OutputText +=3D "FSP Header :\n " >=20 > + self.OutputText +=3D "Signature : " + \ >=20 > + str(self.FihList[i].Signature.decode('utf-8')) + "\n " >=20 > + self.OutputText +=3D "Header Length : " + \ >=20 > + str(hex(self.FihList[i].HeaderLength)) + "\n " >=20 > + self.OutputText +=3D "Header Revision : " + \ >=20 > + str(hex(self.FihList[i].HeaderRevision)) + "\n " >=20 > + self.OutputText +=3D "Spec Version : " + \ >=20 > + str(hex(self.FihList[i].SpecVersion)) + "\n " >=20 > + self.OutputText +=3D "Image Revision : " + \ >=20 > + str(hex(self.FihList[i].ImageRevision)) + "\n " >=20 > + self.OutputText +=3D "Image Id : " + \ >=20 > + str(self.FihList[i].ImageId.decode('utf-8')) + "\n " >=20 > + 
self.OutputText +=3D "Image Size : " + \ >=20 > + str(hex(self.FihList[i].ImageSize)) + "\n " >=20 > + self.OutputText +=3D "Image Base : " + \ >=20 > + str(hex(self.FihList[i].ImageBase)) + "\n " >=20 > + self.OutputText +=3D "Image Attribute : " + \ >=20 > + str(hex(self.FihList[i].ImageAttribute)) + "\n " >=20 > + self.OutputText +=3D "Cfg Region Offset : " + \ >=20 > + str(hex(self.FihList[i].CfgRegionOffset)) + "\n " >=20 > + self.OutputText +=3D "Cfg Region Size : " + \ >=20 > + str(hex(self.FihList[i].CfgRegionSize)) + "\n " >=20 > + self.OutputText +=3D "API Entry Num : " + \ >=20 > + str(hex(self.FihList[i].Reserved2)) + "\n " >=20 > + self.OutputText +=3D "Temp Ram Init Entry : " + \ >=20 > + str(hex(self.FihList[i].TempRamInitEntryOffset)) + "\n " >=20 > + self.OutputText +=3D "FSP Init Entry : " + \ >=20 > + str(hex(self.FihList[i].Reserved3)) + "\n " >=20 > + self.OutputText +=3D "Notify Phase Entry : " + \ >=20 > + str(hex(self.FihList[i].NotifyPhaseEntryOffset)) + "\n " >=20 > + self.OutputText +=3D "Fsp Memory Init Entry : " + \ >=20 > + str(hex(self.FihList[i].FspMemoryInitEntryOffset)) + "\n= " >=20 > + self.OutputText +=3D "Temp Ram Exit Entry : " + \ >=20 > + str(hex(self.FihList[i].TempRamExitEntryOffset)) + "\n " >=20 > + self.OutputText +=3D "Fsp Silicon Init Entry : " + \ >=20 > + str(hex(self.FihList[i].FspSiliconInitEntryOffset)) + "\= n\n" >=20 > + self.OutputText +=3D "FSP Extended Header:\n " >=20 > + self.OutputText +=3D "Signature : " + \ >=20 > + str(self.FspExtList[i].Signature.decode('utf-8')) + "\n = " >=20 > + self.OutputText +=3D "Header Length : " + \ >=20 > + str(hex(self.FspExtList[i].HeaderLength)) + "\n " >=20 > + self.OutputText +=3D "Header Revision : " + \ >=20 > + str(hex(self.FspExtList[i].Revision)) + "\n " >=20 > + self.OutputText +=3D "Fsp Producer Id : " + \ >=20 > + str(self.FspExtList[i].FspProducerId.decode('utf-8')) + = "\n " >=20 > + self.OutputText +=3D "FspProducerRevision : " + \ >=20 > + str(hex(self.FspExtList[i].FspProducerRevision)) + "\n\n= " >=20 > + i +=3D 1 >=20 > + text.insert(tkinter.INSERT, self.OutputText) >=20 > + text.pack() >=20 > + # Configure the scrollbars >=20 > + scroll.config(command=3Dtext.yview) >=20 > + copy_button =3D tkinter.Button( >=20 > + window, text=3D"Copy to Clipboard", command=3Dcopy_text_to_c= lipboard) >=20 > + copy_button.pack(in_=3Dframe, side=3Dtkinter.LEFT, padx=3D20, pa= dy=3D10) >=20 > + exit_button =3D tkinter.Button( >=20 > + window, text=3D"Close", command=3Dwindow.destroy) >=20 > + exit_button.pack(in_=3Dframe, side=3Dtkinter.RIGHT, padx=3D20, p= ady=3D10) >=20 > + window.mainloop() >=20 > + >=20 > + >=20 > +class state: >=20 > + def __init__(self): >=20 > + self.state =3D False >=20 > + >=20 > + def set(self, value): >=20 > + self.state =3D value >=20 > + >=20 > + def get(self): >=20 > + return self.state >=20 > + >=20 > + >=20 > +class application(tkinter.Frame): >=20 > + def __init__(self, master=3DNone): >=20 > + root =3D master >=20 > + >=20 > + self.debug =3D True >=20 > + self.mode =3D 'FSP' >=20 > + self.last_dir =3D '.' 
> +        self.page_id = ''
> +        self.page_list = {}
> +        self.conf_list = {}
> +        self.cfg_data_obj = None
> +        self.org_cfg_data_bin = None
> +        self.in_left = state()
> +        self.in_right = state()
> +
> +        # Check if current directory contains a file with a .yaml extension
> +        # if not, default self.last_dir to a Platform directory where it is
> +        # easier to locate *BoardPkg\CfgData\*Def.yaml files
> +        self.last_dir = '.'
> +        if not any(fname.endswith('.yaml') for fname in os.listdir('.')):
> +            platform_path = Path(os.path.realpath(__file__)).parents[2].\
> +                joinpath('Platform')
> +            if platform_path.exists():
> +                self.last_dir = platform_path
> +
> +        tkinter.Frame.__init__(self, master, borderwidth=2)
> +
> +        self.menu_string = [
> +            'Save Config Data to Binary', 'Load Config Data from Binary',
> +            'Show Binary Configuration',
> +            'Load Config Changes from Delta File',
> +            'Save Config Changes to Delta File',
> +            'Save Full Config Data to Delta File',
> +            'Open Config BSF file'
> +        ]
> +
> +        root.geometry("1200x800")
> +
> +        paned = ttk.Panedwindow(root, orient=tkinter.HORIZONTAL)
> +        paned.pack(fill=tkinter.BOTH, expand=True, padx=(4, 4))
> +
> +        status = tkinter.Label(master, text="", bd=1, relief=tkinter.SUNKEN,
> +                               anchor=tkinter.W)
> +        status.pack(side=tkinter.BOTTOM, fill=tkinter.X)
> +
> +        frame_left = ttk.Frame(paned, height=800, relief="groove")
> +
> +        self.left = ttk.Treeview(frame_left, show="tree")
> +
> +        # Set up tree HScroller
> +        pady = (10, 10)
> +        self.tree_scroll = ttk.Scrollbar(frame_left,
> +                                         orient="vertical",
> +                                         command=self.left.yview)
> +        self.left.configure(yscrollcommand=self.tree_scroll.set)
> +        self.left.bind("<<TreeviewSelect>>", self.on_config_page_select_change)
> +        self.left.bind("<Enter>", lambda e: self.in_left.set(True))
> +        self.left.bind("<Leave>", lambda e: self.in_left.set(False))
> +        self.left.bind("<MouseWheel>", self.on_tree_scroll)
> +
> +        self.left.pack(side='left',
> +                       fill=tkinter.BOTH,
> +                       expand=True,
> +                       padx=(5, 0),
> +                       pady=pady)
> +        self.tree_scroll.pack(side='right', fill=tkinter.Y,
> +                              pady=pady, padx=(0, 5))
> +
> +        frame_right = ttk.Frame(paned, relief="groove")
> +        self.frame_right = frame_right
> +
> +        self.conf_canvas = tkinter.Canvas(frame_right, highlightthickness=0)
> +        self.page_scroll = ttk.Scrollbar(frame_right,
> +                                         orient="vertical",
> +                                         command=self.conf_canvas.yview)
> +        self.right_grid = ttk.Frame(self.conf_canvas)
> +        self.conf_canvas.configure(yscrollcommand=self.page_scroll.set)
> +        self.conf_canvas.pack(side='left',
> +                              fill=tkinter.BOTH,
> +                              expand=True,
> +                              pady=pady,
> +                              padx=(5, 0))
> +        self.page_scroll.pack(side='right', fill=tkinter.Y,
> +                              pady=pady, padx=(0, 5))
> +        self.conf_canvas.create_window(0, 0, window=self.right_grid,
> +                                       anchor='nw')
> +        self.conf_canvas.bind('<Enter>', lambda e: self.in_right.set(True))
> +        self.conf_canvas.bind('<Leave>', lambda e: self.in_right.set(False))
> +        self.conf_canvas.bind("<Configure>", self.on_canvas_configure)
> +        self.conf_canvas.bind_all("<MouseWheel>", self.on_page_scroll)
> +
> +
paned.add(frame_left, weight=3D2) >=20 > + paned.add(frame_right, weight=3D10) >=20 > + >=20 > + style =3D ttk.Style() >=20 > + style.layout("Treeview", [('Treeview.treearea', {'sticky': 'nswe= '})]) >=20 > + >=20 > + menubar =3D tkinter.Menu(root) >=20 > + file_menu =3D tkinter.Menu(menubar, tearoff=3D0) >=20 > + file_menu.add_command(label=3D"Open Config YAML file", >=20 > + command=3Dself.load_from_yaml) >=20 > + file_menu.add_command(label=3Dself.menu_string[6], >=20 > + command=3Dself.load_from_bsf_file) >=20 > + file_menu.add_command(label=3Dself.menu_string[2], >=20 > + command=3Dself.load_from_fd) >=20 > + file_menu.add_command(label=3Dself.menu_string[0], >=20 > + command=3Dself.save_to_bin, >=20 > + state=3D'disabled') >=20 > + file_menu.add_command(label=3Dself.menu_string[1], >=20 > + command=3Dself.load_from_bin, >=20 > + state=3D'disabled') >=20 > + file_menu.add_command(label=3Dself.menu_string[3], >=20 > + command=3Dself.load_from_delta, >=20 > + state=3D'disabled') >=20 > + file_menu.add_command(label=3Dself.menu_string[4], >=20 > + command=3Dself.save_to_delta, >=20 > + state=3D'disabled') >=20 > + file_menu.add_command(label=3Dself.menu_string[5], >=20 > + command=3Dself.save_full_to_delta, >=20 > + state=3D'disabled') >=20 > + file_menu.add_command(label=3D"About", command=3Dself.about) >=20 > + menubar.add_cascade(label=3D"File", menu=3Dfile_menu) >=20 > + self.file_menu =3D file_menu >=20 > + >=20 > + root.config(menu=3Dmenubar) >=20 > + >=20 > + if len(sys.argv) > 1: >=20 > + path =3D sys.argv[1] >=20 > + if not path.endswith('.yaml') and not path.endswith('.pkl'): >=20 > + messagebox.showerror('LOADING ERROR', >=20 > + "Unsupported file '%s' !" % path) >=20 > + return >=20 > + else: >=20 > + self.load_cfg_file(path) >=20 > + >=20 > + if len(sys.argv) > 2: >=20 > + path =3D sys.argv[2] >=20 > + if path.endswith('.dlt'): >=20 > + self.load_delta_file(path) >=20 > + elif path.endswith('.bin'): >=20 > + self.load_bin_file(path) >=20 > + else: >=20 > + messagebox.showerror('LOADING ERROR', >=20 > + "Unsupported file '%s' !" % path) >=20 > + return >=20 > + >=20 > + def set_object_name(self, widget, name): >=20 > + self.conf_list[id(widget)] =3D name >=20 > + >=20 > + def get_object_name(self, widget): >=20 > + if id(widget) in self.conf_list: >=20 > + return self.conf_list[id(widget)] >=20 > + else: >=20 > + return None >=20 > + >=20 > + def limit_entry_size(self, variable, limit): >=20 > + value =3D variable.get() >=20 > + if len(value) > limit: >=20 > + variable.set(value[:limit]) >=20 > + >=20 > + def on_canvas_configure(self, event): >=20 > + self.right_grid.grid_columnconfigure(0, minsize=3Devent.width) >=20 > + >=20 > + def on_tree_scroll(self, event): >=20 > + if not self.in_left.get() and self.in_right.get(): >=20 > + # This prevents scroll event from being handled by both left= and >=20 > + # right frame at the same time. 
>=20 > + self.on_page_scroll(event) >=20 > + return 'break' >=20 > + >=20 > + def on_page_scroll(self, event): >=20 > + if self.in_right.get(): >=20 > + # Only scroll when it is in active area >=20 > + min, max =3D self.page_scroll.get() >=20 > + if not((min =3D=3D 0.0) and (max =3D=3D 1.0)): >=20 > + self.conf_canvas.yview_scroll(-1 * int(event.delta / 120= ), >=20 > + 'units') >=20 > + >=20 > + def update_visibility_for_widget(self, widget, args): >=20 > + >=20 > + visible =3D True >=20 > + item =3D self.get_config_data_item_from_widget(widget, True) >=20 > + if item is None: >=20 > + return visible >=20 > + elif not item: >=20 > + return visible >=20 > + >=20 > + result =3D 1 >=20 > + if item['condition']: >=20 > + result =3D self.evaluate_condition(item) >=20 > + if result =3D=3D 2: >=20 > + # Gray >=20 > + widget.configure(state=3D'disabled') >=20 > + elif result =3D=3D 0: >=20 > + # Hide >=20 > + visible =3D False >=20 > + widget.grid_remove() >=20 > + else: >=20 > + # Show >=20 > + widget.grid() >=20 > + widget.configure(state=3D'normal') >=20 > + >=20 > + return visible >=20 > + >=20 > + def update_widgets_visibility_on_page(self): >=20 > + self.walk_widgets_in_layout(self.right_grid, >=20 > + self.update_visibility_for_widget) >=20 > + >=20 > + def combo_select_changed(self, event): >=20 > + self.update_config_data_from_widget(event.widget, None) >=20 > + self.update_widgets_visibility_on_page() >=20 > + >=20 > + def edit_num_finished(self, event): >=20 > + widget =3D event.widget >=20 > + item =3D self.get_config_data_item_from_widget(widget) >=20 > + if not item: >=20 > + return >=20 > + parts =3D item['type'].split(',') >=20 > + if len(parts) > 3: >=20 > + min =3D parts[2].lstrip()[1:] >=20 > + max =3D parts[3].rstrip()[:-1] >=20 > + min_val =3D array_str_to_value(min) >=20 > + max_val =3D array_str_to_value(max) >=20 > + text =3D widget.get() >=20 > + if ',' in text: >=20 > + text =3D '{ %s }' % text >=20 > + try: >=20 > + value =3D array_str_to_value(text) >=20 > + if value < min_val or value > max_val: >=20 > + raise Exception('Invalid input!') >=20 > + self.set_config_item_value(item, text) >=20 > + except Exception: >=20 > + pass >=20 > + >=20 > + text =3D item['value'].strip('{').strip('}').strip() >=20 > + widget.delete(0, tkinter.END) >=20 > + widget.insert(0, text) >=20 > + >=20 > + self.update_widgets_visibility_on_page() >=20 > + >=20 > + def update_page_scroll_bar(self): >=20 > + # Update scrollbar >=20 > + self.frame_right.update() >=20 > + self.conf_canvas.config(scrollregion=3Dself.conf_canvas.bbox("al= l")) >=20 > + >=20 > + def on_config_page_select_change(self, event): >=20 > + self.update_config_data_on_page() >=20 > + sel =3D self.left.selection() >=20 > + if len(sel) > 0: >=20 > + page_id =3D sel[0] >=20 > + self.build_config_data_page(page_id) >=20 > + self.update_widgets_visibility_on_page() >=20 > + self.update_page_scroll_bar() >=20 > + >=20 > + def walk_widgets_in_layout(self, parent, callback_function, args=3DN= one): >=20 > + for widget in parent.winfo_children(): >=20 > + callback_function(widget, args) >=20 > + >=20 > + def clear_widgets_inLayout(self, parent=3DNone): >=20 > + if parent is None: >=20 > + parent =3D self.right_grid >=20 > + >=20 > + for widget in parent.winfo_children(): >=20 > + widget.destroy() >=20 > + >=20 > + parent.grid_forget() >=20 > + self.conf_list.clear() >=20 > + >=20 > + def build_config_page_tree(self, cfg_page, parent): >=20 > + for page in cfg_page['child']: >=20 > + page_id =3D next(iter(page)) >=20 > + # Put CFG 
items into related page list >=20 > + self.page_list[page_id] =3D self.cfg_data_obj.get_cfg_list(p= age_id) >=20 > + self.page_list[page_id].sort(key=3Dlambda x: x['order']) >=20 > + page_name =3D self.cfg_data_obj.get_page_title(page_id) >=20 > + child =3D self.left.insert( >=20 > + parent, 'end', >=20 > + iid=3Dpage_id, text=3Dpage_name, >=20 > + value=3D0) >=20 > + if len(page[page_id]) > 0: >=20 > + self.build_config_page_tree(page[page_id], child) >=20 > + >=20 > + def is_config_data_loaded(self): >=20 > + return True if len(self.page_list) else False >=20 > + >=20 > + def set_current_config_page(self, page_id): >=20 > + self.page_id =3D page_id >=20 > + >=20 > + def get_current_config_page(self): >=20 > + return self.page_id >=20 > + >=20 > + def get_current_config_data(self): >=20 > + page_id =3D self.get_current_config_page() >=20 > + if page_id in self.page_list: >=20 > + return self.page_list[page_id] >=20 > + else: >=20 > + return [] >=20 > + >=20 > + def build_config_data_page(self, page_id): >=20 > + self.clear_widgets_inLayout() >=20 > + self.set_current_config_page(page_id) >=20 > + disp_list =3D [] >=20 > + for item in self.get_current_config_data(): >=20 > + disp_list.append(item) >=20 > + row =3D 0 >=20 > + disp_list.sort(key=3Dlambda x: x['order']) >=20 > + for item in disp_list: >=20 > + self.add_config_item(item, row) >=20 > + row +=3D 2 >=20 > + >=20 > + def load_config_data(self, file_name): >=20 > + gen_cfg_data =3D CGenYamlCfg() >=20 > + if file_name.endswith('.pkl'): >=20 > + with open(file_name, "rb") as pkl_file: >=20 > + gen_cfg_data.__dict__ =3D marshal.load(pkl_file) >=20 > + gen_cfg_data.prepare_marshal(False) >=20 > + elif file_name.endswith('.yaml'): >=20 > + if gen_cfg_data.load_yaml(file_name) !=3D 0: >=20 > + raise Exception(gen_cfg_data.get_last_error()) >=20 > + else: >=20 > + raise Exception('Unsupported file "%s" !' % file_name) >=20 > + gen_cfg_data.detect_fsp() >=20 > + return gen_cfg_data >=20 > + >=20 > + def about(self): >=20 > + msg =3D 'Configuration Editor\n--------------------------------\= n \ >=20 > + Version 0.8\n2021' >=20 > + lines =3D msg.split('\n') >=20 > + width =3D 30 >=20 > + text =3D [] >=20 > + for line in lines: >=20 > + text.append(line.center(width, ' ')) >=20 > + messagebox.showinfo('Config Editor', '\n'.join(text)) >=20 > + >=20 > + def update_last_dir(self, path): >=20 > + self.last_dir =3D os.path.dirname(path) >=20 > + >=20 > + def get_open_file_name(self, ftype): >=20 > + if self.is_config_data_loaded(): >=20 > + if ftype =3D=3D 'dlt': >=20 > + question =3D '' >=20 > + elif ftype =3D=3D 'bin': >=20 > + question =3D 'All configuration will be reloaded from BI= N file, \ >=20 > + continue ?' 
>=20 > + elif ftype =3D=3D 'yaml': >=20 > + question =3D '' >=20 > + elif ftype =3D=3D 'bsf': >=20 > + question =3D '' >=20 > + else: >=20 > + raise Exception('Unsupported file type !') >=20 > + if question: >=20 > + reply =3D messagebox.askquestion('', question, icon=3D'w= arning') >=20 > + if reply =3D=3D 'no': >=20 > + return None >=20 > + >=20 > + if ftype =3D=3D 'yaml': >=20 > + if self.mode =3D=3D 'FSP': >=20 > + file_type =3D 'YAML' >=20 > + file_ext =3D 'yaml' >=20 > + else: >=20 > + file_type =3D 'YAML or PKL' >=20 > + file_ext =3D 'pkl *.yaml' >=20 > + else: >=20 > + file_type =3D ftype.upper() >=20 > + file_ext =3D ftype >=20 > + >=20 > + path =3D filedialog.askopenfilename( >=20 > + initialdir=3Dself.last_dir, >=20 > + title=3D"Load file", >=20 > + filetypes=3D(("%s files" % file_type, "*.%s" % file_ext)= , ( >=20 > + "all files", "*.*"))) >=20 > + if path: >=20 > + self.update_last_dir(path) >=20 > + return path >=20 > + else: >=20 > + return None >=20 > + >=20 > + def load_from_delta(self): >=20 > + path =3D self.get_open_file_name('dlt') >=20 > + if not path: >=20 > + return >=20 > + self.load_delta_file(path) >=20 > + >=20 > + def load_delta_file(self, path): >=20 > + self.reload_config_data_from_bin(self.org_cfg_data_bin) >=20 > + try: >=20 > + self.cfg_data_obj.override_default_value(path) >=20 > + except Exception as e: >=20 > + messagebox.showerror('LOADING ERROR', str(e)) >=20 > + return >=20 > + self.update_last_dir(path) >=20 > + self.refresh_config_data_page() >=20 > + >=20 > + def load_from_bin(self): >=20 > + path =3D self.get_open_file_name('bin') >=20 > + if not path: >=20 > + return >=20 > + self.load_bin_file(path) >=20 > + >=20 > + def load_bin_file(self, path): >=20 > + with open(path, 'rb') as fd: >=20 > + bin_data =3D bytearray(fd.read()) >=20 > + if len(bin_data) < len(self.org_cfg_data_bin): >=20 > + messagebox.showerror('Binary file size is smaller than what = \ >=20 > + YAML requires !') >=20 > + return >=20 > + >=20 > + try: >=20 > + self.reload_config_data_from_bin(bin_data) >=20 > + except Exception as e: >=20 > + messagebox.showerror('LOADING ERROR', str(e)) >=20 > + return >=20 > + >=20 > + def load_from_bsf_file(self): >=20 > + path =3D self.get_open_file_name('bsf') >=20 > + if not path: >=20 > + return >=20 > + self.load_bsf_file(path) >=20 > + >=20 > + def load_bsf_file(self, path): >=20 > + bsf_file =3D path >=20 > + dsc_file =3D os.path.splitext(bsf_file)[0] + '.dsc' >=20 > + yaml_file =3D os.path.splitext(bsf_file)[0] + '.yaml' >=20 > + bsf_to_dsc(bsf_file, dsc_file) >=20 > + dsc_to_yaml(dsc_file, yaml_file) >=20 > + >=20 > + self.load_cfg_file(yaml_file) >=20 > + return >=20 > + >=20 > + def load_from_fd(self): >=20 > + path =3D filedialog.askopenfilename( >=20 > + initialdir=3Dself.last_dir, >=20 > + title=3D"Load file", >=20 > + filetypes=3D{("Binaries", "*.fv *.fd *.bin *.rom")}) >=20 > + if not path: >=20 > + return >=20 > + self.load_fd_file(path) >=20 > + >=20 > + def load_fd_file(self, path): >=20 > + with open(path, 'rb') as fd: >=20 > + bin_data =3D bytearray(fd.read()) >=20 > + >=20 > + fd =3D FirmwareDevice(0, bin_data) >=20 > + fd.ParseFd() >=20 > + fd.ParseFsp() >=20 > + fd.OutputFsp() >=20 > + >=20 > + def load_cfg_file(self, path): >=20 > + # Save current values in widget and clear database >=20 > + self.clear_widgets_inLayout() >=20 > + self.left.delete(*self.left.get_children()) >=20 > + >=20 > + self.cfg_data_obj =3D self.load_config_data(path) >=20 > + >=20 > + self.update_last_dir(path) >=20 > + self.org_cfg_data_bin 
=3D self.cfg_data_obj.generate_binary_arra= y() >=20 > + self.build_config_page_tree(self.cfg_data_obj.get_cfg_page()['ro= ot'], >=20 > + '') >=20 > + >=20 > + for menu in self.menu_string: >=20 > + self.file_menu.entryconfig(menu, state=3D"normal") >=20 > + >=20 > + return 0 >=20 > + >=20 > + def load_from_yaml(self): >=20 > + path =3D self.get_open_file_name('yaml') >=20 > + if not path: >=20 > + return >=20 > + >=20 > + self.load_cfg_file(path) >=20 > + >=20 > + def get_save_file_name(self, extension): >=20 > + path =3D filedialog.asksaveasfilename( >=20 > + initialdir=3Dself.last_dir, >=20 > + title=3D"Save file", >=20 > + defaultextension=3Dextension) >=20 > + if path: >=20 > + self.last_dir =3D os.path.dirname(path) >=20 > + return path >=20 > + else: >=20 > + return None >=20 > + >=20 > + def save_delta_file(self, full=3DFalse): >=20 > + path =3D self.get_save_file_name(".dlt") >=20 > + if not path: >=20 > + return >=20 > + >=20 > + self.update_config_data_on_page() >=20 > + new_data =3D self.cfg_data_obj.generate_binary_array() >=20 > + self.cfg_data_obj.generate_delta_file_from_bin(path, >=20 > + self.org_cfg_data= _bin, >=20 > + new_data, full) >=20 > + >=20 > + def save_to_delta(self): >=20 > + self.save_delta_file() >=20 > + >=20 > + def save_full_to_delta(self): >=20 > + self.save_delta_file(True) >=20 > + >=20 > + def save_to_bin(self): >=20 > + path =3D self.get_save_file_name(".bin") >=20 > + if not path: >=20 > + return >=20 > + >=20 > + self.update_config_data_on_page() >=20 > + bins =3D self.cfg_data_obj.save_current_to_bin() >=20 > + >=20 > + with open(path, 'wb') as fd: >=20 > + fd.write(bins) >=20 > + >=20 > + def refresh_config_data_page(self): >=20 > + self.clear_widgets_inLayout() >=20 > + self.on_config_page_select_change(None) >=20 > + >=20 > + def reload_config_data_from_bin(self, bin_dat): >=20 > + self.cfg_data_obj.load_default_from_bin(bin_dat) >=20 > + self.refresh_config_data_page() >=20 > + >=20 > + def set_config_item_value(self, item, value_str): >=20 > + itype =3D item['type'].split(',')[0] >=20 > + if itype =3D=3D "Table": >=20 > + new_value =3D value_str >=20 > + elif itype =3D=3D "EditText": >=20 > + length =3D (self.cfg_data_obj.get_cfg_item_length(item) + 7)= // 8 >=20 > + new_value =3D value_str[:length] >=20 > + if item['value'].startswith("'"): >=20 > + new_value =3D "'%s'" % new_value >=20 > + else: >=20 > + try: >=20 > + new_value =3D self.cfg_data_obj.reformat_value_str( >=20 > + value_str, >=20 > + self.cfg_data_obj.get_cfg_item_length(item), >=20 > + item['value']) >=20 > + except Exception: >=20 > + print("WARNING: Failed to format value string '%s' for '= %s' !" >=20 > + % (value_str, item['path'])) >=20 > + new_value =3D item['value'] >=20 > + >=20 > + if item['value'] !=3D new_value: >=20 > + if self.debug: >=20 > + print('Update %s from %s to %s !' 
> +                      % (item['cname'], item['value'], new_value))
> +            item['value'] = new_value
> +
> +    def get_config_data_item_from_widget(self, widget, label=False):
> +        name = self.get_object_name(widget)
> +        if not name or not len(self.page_list):
> +            return None
> +
> +        if name.startswith('LABEL_'):
> +            if label:
> +                path = name[6:]
> +            else:
> +                return None
> +        else:
> +            path = name
> +        item = self.cfg_data_obj.get_item_by_path(path)
> +        return item
> +
> +    def update_config_data_from_widget(self, widget, args):
> +        item = self.get_config_data_item_from_widget(widget)
> +        if item is None:
> +            return
> +        elif not item:
> +            if isinstance(widget, tkinter.Label):
> +                return
> +            raise Exception('Failed to find "%s" !' %
> +                            self.get_object_name(widget))
> +
> +        itype = item['type'].split(',')[0]
> +        if itype == "Combo":
> +            opt_list = self.cfg_data_obj.get_cfg_item_options(item)
> +            tmp_list = [opt[0] for opt in opt_list]
> +            idx = widget.current()
> +            self.set_config_item_value(item, tmp_list[idx])
> +        elif itype in ["EditNum", "EditText"]:
> +            self.set_config_item_value(item, widget.get())
> +        elif itype in ["Table"]:
> +            new_value = bytes_to_bracket_str(widget.get())
> +            self.set_config_item_value(item, new_value)
> +
> +    def evaluate_condition(self, item):
> +        try:
> +            result = self.cfg_data_obj.evaluate_condition(item)
> +        except Exception:
> +            print("WARNING: Condition '%s' is invalid for '%s' !"
> +                  % (item['condition'], item['path']))
> +            result = 1
> +        return result
> +
> +    def add_config_item(self, item, row):
> +        parent = self.right_grid
> +
> +        name = tkinter.Label(parent, text=item['name'], anchor="w")
> +
> +        parts = item['type'].split(',')
> +        itype = parts[0]
> +        widget = None
> +
> +        if itype == "Combo":
> +            # Build
> +            opt_list = self.cfg_data_obj.get_cfg_item_options(item)
> +            current_value = self.cfg_data_obj.get_cfg_item_value(item, False)
> +            option_list = []
> +            current = None
> +
> +            for idx, option in enumerate(opt_list):
> +                option_str = option[0]
> +                try:
> +                    option_value = self.cfg_data_obj.get_value(
> +                        option_str,
> +                        len(option_str), False)
> +                except Exception:
> +                    option_value = 0
> +                    print('WARNING: Option "%s" has invalid format for "%s" !'
> +                          % (option_str, item['path']))
> +                if option_value == current_value:
> +                    current = idx
> +                option_list.append(option[1])
> +
> +            widget = ttk.Combobox(parent, value=option_list, state="readonly")
> +            widget.bind("<<ComboboxSelected>>", self.combo_select_changed)
> +            widget.unbind_class("TCombobox", "<MouseWheel>")
> +
> +            if current is None:
> +                print('WARNING: Value "%s" is an invalid option for "%s" !'
% >=20 > + (current_value, item['path'])) >=20 > + else: >=20 > + widget.current(current) >=20 > + >=20 > + elif itype in ["EditNum", "EditText"]: >=20 > + txt_val =3D tkinter.StringVar() >=20 > + widget =3D tkinter.Entry(parent, textvariable=3Dtxt_val) >=20 > + value =3D item['value'].strip("'") >=20 > + if itype in ["EditText"]: >=20 > + txt_val.trace( >=20 > + 'w', >=20 > + lambda *args: self.limit_entry_size >=20 > + (txt_val, (self.cfg_data_obj.get_cfg_item_length(ite= m) >=20 > + + 7) // 8)) >=20 > + elif itype in ["EditNum"]: >=20 > + value =3D item['value'].strip("{").strip("}").strip() >=20 > + widget.bind("", self.edit_num_finished) >=20 > + txt_val.set(value) >=20 > + >=20 > + elif itype in ["Table"]: >=20 > + bins =3D self.cfg_data_obj.get_cfg_item_value(item, True) >=20 > + col_hdr =3D item['option'].split(',') >=20 > + widget =3D custom_table(parent, col_hdr, bins) >=20 > + >=20 > + else: >=20 > + if itype and itype not in ["Reserved"]: >=20 > + print("WARNING: Type '%s' is invalid for '%s' !" % >=20 > + (itype, item['path'])) >=20 > + >=20 > + if widget: >=20 > + create_tool_tip(widget, item['help']) >=20 > + self.set_object_name(name, 'LABEL_' + item['path']) >=20 > + self.set_object_name(widget, item['path']) >=20 > + name.grid(row=3Drow, column=3D0, padx=3D10, pady=3D5, sticky= =3D"nsew") >=20 > + widget.grid(row=3Drow + 1, rowspan=3D1, column=3D0, >=20 > + padx=3D10, pady=3D5, sticky=3D"nsew") >=20 > + >=20 > + def update_config_data_on_page(self): >=20 > + self.walk_widgets_in_layout(self.right_grid, >=20 > + self.update_config_data_from_widget) >=20 > + >=20 > + >=20 > +if __name__ =3D=3D '__main__': >=20 > + root =3D tkinter.Tk() >=20 > + app =3D application(master=3Droot) >=20 > + root.title("Config Editor") >=20 > + root.mainloop() >=20 > diff --git a/IntelFsp2Pkg/Tools/ConfigEditor/FspDscBsf2Yaml.py > b/IntelFsp2Pkg/Tools/ConfigEditor/FspDscBsf2Yaml.py > new file mode 100644 > index 0000000000..f9b2503414 > --- /dev/null > +++ b/IntelFsp2Pkg/Tools/ConfigEditor/FspDscBsf2Yaml.py > @@ -0,0 +1,664 @@ > +#!/usr/bin/env python >=20 > +# @ FspBsf2Dsc.py >=20 > +# This script convert FSP BSF format into DSC format >=20 > +# Copyright (c) 2020, Intel Corporation. All rights reserved.
>=20 > +# SPDX-License-Identifier: BSD-2-Clause-Patent >=20 > +# >=20 > +## >=20 > + >=20 > +import os >=20 > +import re >=20 > +import sys >=20 > + >=20 > +from collections import OrderedDict >=20 > +from datetime import date >=20 > + >=20 > +from FspGenCfgData import CFspBsf2Dsc, CGenCfgData >=20 > + >=20 > +__copyright_tmp__ =3D """## @file >=20 > +# >=20 > +# Slim Bootloader CFGDATA %s File. >=20 > +# >=20 > +# Copyright (c) %4d, Intel Corporation. All rights reserved.
>=20 > +# SPDX-License-Identifier: BSD-2-Clause-Patent >=20 > +# >=20 > +## >=20 > +""" >=20 > + >=20 > + >=20 > +class CFspDsc2Yaml(): >=20 > + >=20 > + def __init__(self): >=20 > + self._Hdr_key_list =3D ['EMBED', 'STRUCT'] >=20 > + self._Bsf_key_list =3D ['NAME', 'HELP', 'TYPE', 'PAGE', 'PAGES', >=20 > + 'OPTION', 'CONDITION', 'ORDER', 'MARKER', >=20 > + 'SUBT', 'FIELD', 'FIND'] >=20 > + self.gen_cfg_data =3D None >=20 > + self.cfg_reg_exp =3D re.compile( >=20 > + "^([_a-zA-Z0-9$\\(\\)]+)\\s*\\|\\s*(0x[0-9A-F]+|\\*)" >=20 > + "\\s*\\|\\s*(\\d+|0x[0-9a-fA-F]+)\\s*\\|\\s*(.+)") >=20 > + self.bsf_reg_exp =3D re.compile("(%s):{(.+?)}(?:$|\\s+)" >=20 > + % '|'.join(self._Bsf_key_list)) >=20 > + self.hdr_reg_exp =3D re.compile("(%s):{(.+?)}" >=20 > + % '|'.join(self._Hdr_key_list)) >=20 > + self.prefix =3D '' >=20 > + self.unused_idx =3D 0 >=20 > + self.offset =3D 0 >=20 > + self.base_offset =3D 0 >=20 > + >=20 > + def load_config_data_from_dsc(self, file_name): >=20 > + """ >=20 > + Load and parse a DSC CFGDATA file. >=20 > + """ >=20 > + gen_cfg_data =3D CGenCfgData('FSP') >=20 > + if file_name.endswith('.dsc'): >=20 > + if gen_cfg_data.ParseDscFile(file_name) !=3D 0: >=20 > + raise Exception('DSC file parsing error !') >=20 > + if gen_cfg_data.CreateVarDict() !=3D 0: >=20 > + raise Exception('DSC variable creation error !') >=20 > + else: >=20 > + raise Exception('Unsupported file "%s" !' % file_name) >=20 > + gen_cfg_data.UpdateDefaultValue() >=20 > + self.gen_cfg_data =3D gen_cfg_data >=20 > + >=20 > + def print_dsc_line(self): >=20 > + """ >=20 > + Debug function to print all DSC lines. >=20 > + """ >=20 > + for line in self.gen_cfg_data._DscLines: >=20 > + print(line) >=20 > + >=20 > + def format_value(self, field, text, indent=3D''): >=20 > + """ >=20 > + Format a CFGDATA item into YAML format. 
>=20 > + """ >=20 > + if (not text.startswith('!expand')) and (': ' in text): >=20 > + tgt =3D ':' if field =3D=3D 'option' else '- ' >=20 > + text =3D text.replace(': ', tgt) >=20 > + lines =3D text.splitlines() >=20 > + if len(lines) =3D=3D 1 and field !=3D 'help': >=20 > + return text >=20 > + else: >=20 > + return '>\n ' + '\n '.join( >=20 > + [indent + i.lstrip() for i in lines]) >=20 > + >=20 > + def reformat_pages(self, val): >=20 > + # Convert XXX:YYY into XXX::YYY format for page definition >=20 > + parts =3D val.split(',') >=20 > + if len(parts) <=3D 1: >=20 > + return val >=20 > + >=20 > + new_val =3D [] >=20 > + for each in parts: >=20 > + nodes =3D each.split(':') >=20 > + if len(nodes) =3D=3D 2: >=20 > + each =3D '%s::%s' % (nodes[0], nodes[1]) >=20 > + new_val.append(each) >=20 > + ret =3D ','.join(new_val) >=20 > + return ret >=20 > + >=20 > + def reformat_struct_value(self, utype, val): >=20 > + # Convert DSC UINT16/32/64 array into new format by >=20 > + # adding prefix 0:0[WDQ] to provide hint to the array format >=20 > + if utype in ['UINT16', 'UINT32', 'UINT64']: >=20 > + if val and val[0] =3D=3D '{' and val[-1] =3D=3D '}': >=20 > + if utype =3D=3D 'UINT16': >=20 > + unit =3D 'W' >=20 > + elif utype =3D=3D 'UINT32': >=20 > + unit =3D 'D' >=20 > + else: >=20 > + unit =3D 'Q' >=20 > + val =3D '{ 0:0%s, %s }' % (unit, val[1:-1]) >=20 > + return val >=20 > + >=20 > + def process_config(self, cfg): >=20 > + if 'page' in cfg: >=20 > + cfg['page'] =3D self.reformat_pages(cfg['page']) >=20 > + >=20 > + if 'struct' in cfg: >=20 > + cfg['value'] =3D self.reformat_struct_value( >=20 > + cfg['struct'], cfg['value']) >=20 > + >=20 > + def parse_dsc_line(self, dsc_line, config_dict, init_dict, include): >=20 > + """ >=20 > + Parse a line in DSC and update the config dictionary accordingly= . 
>=20 > + """ >=20 > + init_dict.clear() >=20 > + match =3D re.match('g(CfgData|\\w+FspPkgTokenSpaceGuid)\\.(.+)', >=20 > + dsc_line) >=20 > + if match: >=20 > + match =3D self.cfg_reg_exp.match(match.group(2)) >=20 > + if not match: >=20 > + return False >=20 > + config_dict['cname'] =3D self.prefix + match.group(1) >=20 > + value =3D match.group(4).strip() >=20 > + length =3D match.group(3).strip() >=20 > + config_dict['length'] =3D length >=20 > + config_dict['value'] =3D value >=20 > + if match.group(2) =3D=3D '*': >=20 > + self.offset +=3D int(length, 0) >=20 > + else: >=20 > + org_offset =3D int(match.group(2), 0) >=20 > + if org_offset =3D=3D 0: >=20 > + self.base_offset =3D self.offset >=20 > + offset =3D org_offset + self.base_offset >=20 > + if self.offset !=3D offset: >=20 > + if offset > self.offset: >=20 > + init_dict['padding'] =3D offset - self.offset >=20 > + self.offset =3D offset + int(length, 0) >=20 > + return True >=20 > + >=20 > + match =3D re.match("^\\s*#\\s+!([<>])\\s+include\\s+(.+)", dsc_l= ine) >=20 > + if match and len(config_dict) =3D=3D 0: >=20 > + # !include should not be inside a config field >=20 > + # if so, do not convert include into YAML >=20 > + init_dict =3D dict(config_dict) >=20 > + config_dict.clear() >=20 > + config_dict['cname'] =3D '$ACTION' >=20 > + if match.group(1) =3D=3D '<': >=20 > + config_dict['include'] =3D match.group(2) >=20 > + else: >=20 > + config_dict['include'] =3D '' >=20 > + return True >=20 > + >=20 > + match =3D re.match("^\\s*#\\s+(!BSF|!HDR)\\s+(.+)", dsc_line) >=20 > + if not match: >=20 > + return False >=20 > + >=20 > + remaining =3D match.group(2) >=20 > + if match.group(1) =3D=3D '!BSF': >=20 > + result =3D self.bsf_reg_exp.findall(remaining) >=20 > + if not result: >=20 > + return False >=20 > + >=20 > + for each in result: >=20 > + key =3D each[0].lower() >=20 > + val =3D each[1] >=20 > + if key =3D=3D 'field': >=20 > + name =3D each[1] >=20 > + if ':' not in name: >=20 > + raise Exception('Incorrect bit field format !') >=20 > + parts =3D name.split(':') >=20 > + config_dict['length'] =3D parts[1] >=20 > + config_dict['cname'] =3D '@' + parts[0] >=20 > + return True >=20 > + elif key in ['pages', 'page', 'find']: >=20 > + init_dict =3D dict(config_dict) >=20 > + config_dict.clear() >=20 > + config_dict['cname'] =3D '$ACTION' >=20 > + if key =3D=3D 'find': >=20 > + config_dict['find'] =3D val >=20 > + else: >=20 > + config_dict['page'] =3D val >=20 > + return True >=20 > + elif key =3D=3D 'subt': >=20 > + config_dict.clear() >=20 > + parts =3D each[1].split(':') >=20 > + tmp_name =3D parts[0][:-5] >=20 > + if tmp_name =3D=3D 'CFGHDR': >=20 > + cfg_tag =3D '_$FFF_' >=20 > + sval =3D '!expand { %s_TMPL : [ ' % \ >=20 > + tmp_name + '%s, %s, ' % (parts[1], cfg_tag) = + \ >=20 > + ', '.join(parts[2:]) + ' ] }' >=20 > + else: >=20 > + sval =3D '!expand { %s_TMPL : [ ' % \ >=20 > + tmp_name + ', '.join(parts[1:]) + ' ] }' >=20 > + config_dict.clear() >=20 > + config_dict['cname'] =3D tmp_name >=20 > + config_dict['expand'] =3D sval >=20 > + return True >=20 > + else: >=20 > + if key in ['name', 'help', 'option'] and \ >=20 > + val.startswith('+'): >=20 > + val =3D config_dict[key] + '\n' + val[1:] >=20 > + if val.strip() =3D=3D '': >=20 > + val =3D "''" >=20 > + config_dict[key] =3D val >=20 > + >=20 > + else: >=20 > + match =3D self.hdr_reg_exp.match(remaining) >=20 > + if not match: >=20 > + return False >=20 > + key =3D match.group(1) >=20 > + remaining =3D match.group(2) >=20 > + if key =3D=3D 'EMBED': >=20 > + parts 
=3D remaining.split(':') >=20 > + names =3D parts[0].split(',') >=20 > + if parts[-1] =3D=3D 'END': >=20 > + prefix =3D '>' >=20 > + else: >=20 > + prefix =3D '<' >=20 > + skip =3D False >=20 > + if parts[1].startswith('TAG_'): >=20 > + tag_txt =3D '%s:%s' % (names[0], parts[1]) >=20 > + else: >=20 > + tag_txt =3D names[0] >=20 > + if parts[2] in ['START', 'END']: >=20 > + if names[0] =3D=3D 'PCIE_RP_PIN_CTRL[]': >=20 > + skip =3D True >=20 > + else: >=20 > + tag_txt =3D '%s:%s' % (names[0], parts[1]) >=20 > + if not skip: >=20 > + config_dict.clear() >=20 > + config_dict['cname'] =3D prefix + tag_txt >=20 > + return True >=20 > + >=20 > + if key =3D=3D 'STRUCT': >=20 > + text =3D remaining.strip() >=20 > + config_dict[key.lower()] =3D text >=20 > + >=20 > + return False >=20 > + >=20 > + def process_template_lines(self, lines): >=20 > + """ >=20 > + Process a line in DSC template section. >=20 > + """ >=20 > + template_name =3D '' >=20 > + bsf_temp_dict =3D OrderedDict() >=20 > + temp_file_dict =3D OrderedDict() >=20 > + include_file =3D ['.'] >=20 > + >=20 > + for line in lines: >=20 > + match =3D re.match("^\\s*#\\s+!([<>])\\s+include\\s+(.+)", l= ine) >=20 > + if match: >=20 > + if match.group(1) =3D=3D '<': >=20 > + include_file.append(match.group(2)) >=20 > + else: >=20 > + include_file.pop() >=20 > + >=20 > + match =3D re.match( >=20 > + "^\\s*#\\s+(!BSF)\\s+DEFT:{(.+?):(START|END)}", line) >=20 > + if match: >=20 > + if match.group(3) =3D=3D 'START' and not template_name: >=20 > + template_name =3D match.group(2).strip() >=20 > + temp_file_dict[template_name] =3D list(include_file) >=20 > + bsf_temp_dict[template_name] =3D [] >=20 > + if match.group(3) =3D=3D 'END' and \ >=20 > + (template_name =3D=3D match.group(2).strip()) an= d \ >=20 > + template_name: >=20 > + template_name =3D '' >=20 > + else: >=20 > + if template_name: >=20 > + bsf_temp_dict[template_name].append(line) >=20 > + return bsf_temp_dict, temp_file_dict >=20 > + >=20 > + def process_option_lines(self, lines): >=20 > + """ >=20 > + Process a line in DSC config section. 
> +        """
> +        cfgs = []
> +        struct_end = False
> +        config_dict = dict()
> +        init_dict = dict()
> +        include = ['']
> +        for line in lines:
> +            ret = self.parse_dsc_line(line, config_dict, init_dict, include)
> +            if ret:
> +                if 'padding' in init_dict:
> +                    num = init_dict['padding']
> +                    init_dict.clear()
> +                    padding_dict = {}
> +                    cfgs.append(padding_dict)
> +                    padding_dict['cname'] = 'UnusedUpdSpace%d' % \
> +                        self.unused_idx
> +                    padding_dict['length'] = '0x%x' % num
> +                    padding_dict['value'] = '{ 0 }'
> +                    self.unused_idx += 1
> +
> +                if cfgs and cfgs[-1]['cname'][0] != '@' and \
> +                        config_dict['cname'][0] == '@':
> +                    # it is a bit field, mark the previous one as virtual
> +                    cname = cfgs[-1]['cname']
> +                    new_cfg = dict(cfgs[-1])
> +                    new_cfg['cname'] = '@$STRUCT'
> +                    cfgs[-1].clear()
> +                    cfgs[-1]['cname'] = cname
> +                    cfgs.append(new_cfg)
> +
> +                if cfgs and cfgs[-1]['cname'] == 'CFGHDR' and \
> +                        config_dict['cname'][0] == '<':
> +                    # swap CfgHeader and the CFG_DATA order
> +                    if ':' in config_dict['cname']:
> +                        # replace the real TAG for CFG_DATA
> +                        cfgs[-1]['expand'] = cfgs[-1]['expand'].replace(
> +                            '_$FFF_', '0x%s' %
> +                            config_dict['cname'].split(':')[1][4:])
> +                    cfgs.insert(-1, config_dict)
> +                else:
> +                    self.process_config(config_dict)
> +                    if struct_end:
> +                        struct_end = False
> +                        cfgs.insert(-1, config_dict)
> +                    else:
> +                        cfgs.append(config_dict)
> +                        if config_dict['cname'][0] == '>':
> +                            struct_end = True
> +
> +                config_dict = dict(init_dict)
> +        return cfgs
> +
> +    def variable_fixup(self, each):
> +        """
> +        Fix up some variable definitions for SBL.
> +        """
> +        key = each
> +        val = self.gen_cfg_data._MacroDict[each]
> +        return key, val
> +
> +    def template_fixup(self, tmp_name, tmp_list):
> +        """
> +        Fix up some special config templates for SBL
> +        """
> +        return
> +
> +    def config_fixup(self, cfg_list):
> +        """
> +        Fix up some special config items for SBL.
> +        """
> +
> +        # Insert FSPT_UPD/FSPM_UPD/FSPS_UPD tag so as to create C structure
> +        idxs = []
> +        for idx, cfg in enumerate(cfg_list):
> +            if cfg['cname'].startswith('<FSP_UPD_HEADER'):
> +                idxs.append(idx)
> +
> +        if len(idxs) != 3:
> +            return
> +
> +        # Handle insert backwards so that the index does not change in the loop
> +        fsp_comp = 'SMT'
> +        idx_comp = 0
> +        for idx in idxs[::-1]:
> +            # Add current FSP?_UPD start tag
> +            cfgfig_dict = {}
> +            cfgfig_dict['cname'] = '<FSP%s_UPD' % fsp_comp[idx_comp]
> +            cfg_list.insert(idx, cfgfig_dict)
> +            if idx_comp < 2:
> +                # Add previous FSP?_UPD end tag
> +                cfgfig_dict = {}
> +                cfgfig_dict['cname'] = '>FSP%s_UPD' % fsp_comp[idx_comp + 1]
> +                cfg_list.insert(idx, cfgfig_dict)
> +            idx_comp += 1
> +
> +        # Add final FSPS_UPD end tag
> +        cfgfig_dict = {}
> +        cfgfig_dict['cname'] = '>FSP%s_UPD' % fsp_comp[0]
> +        cfg_list.append(cfgfig_dict)
> +
> +        return
> +
> +    def get_section_range(self, section_name):
> +        """
> +        Extract line number range from config file for a given section name.
>=20 > + """ >=20 > + start =3D -1 >=20 > + end =3D -1 >=20 > + for idx, line in enumerate(self.gen_cfg_data._DscLines): >=20 > + if start < 0 and line.startswith('[%s]' % section_name): >=20 > + start =3D idx >=20 > + elif start >=3D 0 and line.startswith('['): >=20 > + end =3D idx >=20 > + break >=20 > + if start =3D=3D -1: >=20 > + start =3D 0 >=20 > + if end =3D=3D -1: >=20 > + end =3D len(self.gen_cfg_data._DscLines) >=20 > + return start, end >=20 > + >=20 > + def normalize_file_name(self, file, is_temp=3DFalse): >=20 > + """ >=20 > + Normalize file name convention so that it is consistent. >=20 > + """ >=20 > + if file.endswith('.dsc'): >=20 > + file =3D file[:-4] + '.yaml' >=20 > + dir_name =3D os.path.dirname(file) >=20 > + base_name =3D os.path.basename(file) >=20 > + if is_temp: >=20 > + if 'Template_' not in file: >=20 > + base_name =3D base_name.replace('Template', 'Template_') >=20 > + else: >=20 > + if 'CfgData_' not in file: >=20 > + base_name =3D base_name.replace('CfgData', 'CfgData_') >=20 > + if dir_name: >=20 > + path =3D dir_name + '/' + base_name >=20 > + else: >=20 > + path =3D base_name >=20 > + return path >=20 > + >=20 > + def output_variable(self): >=20 > + """ >=20 > + Output variable block into a line list. >=20 > + """ >=20 > + lines =3D [] >=20 > + for each in self.gen_cfg_data._MacroDict: >=20 > + key, value =3D self.variable_fixup(each) >=20 > + lines.append('%-30s : %s' % (key, value)) >=20 > + return lines >=20 > + >=20 > + def output_template(self): >=20 > + """ >=20 > + Output template block into a line list. >=20 > + """ >=20 > + self.offset =3D 0 >=20 > + self.base_offset =3D 0 >=20 > + start, end =3D self.get_section_range('PcdsDynamicVpd.Tmp') >=20 > + bsf_temp_dict, temp_file_dict =3D self.process_template_lines( >=20 > + self.gen_cfg_data._DscLines[start:end]) >=20 > + template_dict =3D dict() >=20 > + lines =3D [] >=20 > + file_lines =3D {} >=20 > + last_file =3D '.' >=20 > + file_lines[last_file] =3D [] >=20 > + >=20 > + for tmp_name in temp_file_dict: >=20 > + temp_file_dict[tmp_name][-1] =3D self.normalize_file_name( >=20 > + temp_file_dict[tmp_name][-1], True) >=20 > + if len(temp_file_dict[tmp_name]) > 1: >=20 > + temp_file_dict[tmp_name][-2] =3D self.normalize_file_nam= e( >=20 > + temp_file_dict[tmp_name][-2], True) >=20 > + >=20 > + for tmp_name in bsf_temp_dict: >=20 > + file =3D temp_file_dict[tmp_name][-1] >=20 > + if last_file !=3D file and len(temp_file_dict[tmp_name]) > 1= : >=20 > + inc_file =3D temp_file_dict[tmp_name][-2] >=20 > + file_lines[inc_file].extend( >=20 > + ['', '- !include %s' % temp_file_dict[tmp_name][-1],= '']) >=20 > + last_file =3D file >=20 > + if file not in file_lines: >=20 > + file_lines[file] =3D [] >=20 > + lines =3D file_lines[file] >=20 > + text =3D bsf_temp_dict[tmp_name] >=20 > + tmp_list =3D self.process_option_lines(text) >=20 > + self.template_fixup(tmp_name, tmp_list) >=20 > + template_dict[tmp_name] =3D tmp_list >=20 > + lines.append('%s: >' % tmp_name) >=20 > + lines.extend(self.output_dict(tmp_list, False)['.']) >=20 > + lines.append('\n') >=20 > + return file_lines >=20 > + >=20 > + def output_config(self): >=20 > + """ >=20 > + Output config block into a line list. 
>=20 > + """ >=20 > + self.offset =3D 0 >=20 > + self.base_offset =3D 0 >=20 > + start, end =3D self.get_section_range('PcdsDynamicVpd.Upd') >=20 > + cfgs =3D self.process_option_lines( >=20 > + self.gen_cfg_data._DscLines[start:end]) >=20 > + self.config_fixup(cfgs) >=20 > + file_lines =3D self.output_dict(cfgs, True) >=20 > + return file_lines >=20 > + >=20 > + def output_dict(self, cfgs, is_configs): >=20 > + """ >=20 > + Output one config item into a line list. >=20 > + """ >=20 > + file_lines =3D {} >=20 > + level =3D 0 >=20 > + file =3D '.' >=20 > + for each in cfgs: >=20 > + if 'length' in each and int(each['length'], 0) =3D=3D 0: >=20 > + continue >=20 > + >=20 > + if 'include' in each: >=20 > + if each['include']: >=20 > + each['include'] =3D self.normalize_file_name( >=20 > + each['include']) >=20 > + file_lines[file].extend( >=20 > + ['', '- !include %s' % each['include'], '']) >=20 > + file =3D each['include'] >=20 > + else: >=20 > + file =3D '.' >=20 > + continue >=20 > + >=20 > + if file not in file_lines: >=20 > + file_lines[file] =3D [] >=20 > + >=20 > + lines =3D file_lines[file] >=20 > + name =3D each['cname'] >=20 > + >=20 > + prefix =3D name[0] >=20 > + if prefix =3D=3D '<': >=20 > + level +=3D 1 >=20 > + >=20 > + padding =3D ' ' * level >=20 > + if prefix not in '<>@': >=20 > + padding +=3D ' ' >=20 > + else: >=20 > + name =3D name[1:] >=20 > + if prefix =3D=3D '@': >=20 > + padding +=3D ' ' >=20 > + >=20 > + if ':' in name: >=20 > + parts =3D name.split(':') >=20 > + name =3D parts[0] >=20 > + >=20 > + padding =3D padding[2:] if is_configs else padding >=20 > + >=20 > + if prefix !=3D '>': >=20 > + if 'expand' in each: >=20 > + lines.append('%s- %s' % (padding, each['expand'])) >=20 > + else: >=20 > + lines.append('%s- %-12s :' % (padding, name)) >=20 > + >=20 > + for field in each: >=20 > + if field in ['cname', 'expand', 'include']: >=20 > + continue >=20 > + value_str =3D self.format_value( >=20 > + field, each[field], padding + ' ' * 16) >=20 > + full_line =3D ' %s %-12s : %s' % (padding, field, valu= e_str) >=20 > + lines.extend(full_line.splitlines()) >=20 > + >=20 > + if prefix =3D=3D '>': >=20 > + level -=3D 1 >=20 > + if level =3D=3D 0: >=20 > + lines.append('') >=20 > + >=20 > + return file_lines >=20 > + >=20 > + >=20 > +def bsf_to_dsc(bsf_file, dsc_file): >=20 > + fsp_dsc =3D CFspBsf2Dsc(bsf_file) >=20 > + dsc_lines =3D fsp_dsc.get_dsc_lines() >=20 > + fd =3D open(dsc_file, 'w') >=20 > + fd.write('\n'.join(dsc_lines)) >=20 > + fd.close() >=20 > + return >=20 > + >=20 > + >=20 > +def dsc_to_yaml(dsc_file, yaml_file): >=20 > + dsc2yaml =3D CFspDsc2Yaml() >=20 > + dsc2yaml.load_config_data_from_dsc(dsc_file) >=20 > + >=20 > + cfgs =3D {} >=20 > + for cfg in ['Template', 'Option']: >=20 > + if cfg =3D=3D 'Template': >=20 > + file_lines =3D dsc2yaml.output_template() >=20 > + else: >=20 > + file_lines =3D dsc2yaml.output_config() >=20 > + for file in file_lines: >=20 > + lines =3D file_lines[file] >=20 > + if file =3D=3D '.': >=20 > + cfgs[cfg] =3D lines >=20 > + else: >=20 > + if ('/' in file or '\\' in file): >=20 > + continue >=20 > + file =3D os.path.basename(file) >=20 > + out_dir =3D os.path.dirname(file) >=20 > + fo =3D open(os.path.join(out_dir, file), 'w') >=20 > + fo.write(__copyright_tmp__ % ( >=20 > + cfg, date.today().year) + '\n\n') >=20 > + for line in lines: >=20 > + fo.write(line + '\n') >=20 > + fo.close() >=20 > + >=20 > + variables =3D dsc2yaml.output_variable() >=20 > + fo =3D open(yaml_file, 'w') >=20 > + fo.write(__copyright_tmp__ % 
('Default', date.today().year)) >=20 > + if len(variables) > 0: >=20 > + fo.write('\n\nvariable:\n') >=20 > + for line in variables: >=20 > + fo.write(' ' + line + '\n') >=20 > + >=20 > + fo.write('\n\ntemplate:\n') >=20 > + for line in cfgs['Template']: >=20 > + fo.write(' ' + line + '\n') >=20 > + >=20 > + fo.write('\n\nconfigs:\n') >=20 > + for line in cfgs['Option']: >=20 > + fo.write(' ' + line + '\n') >=20 > + >=20 > + fo.close() >=20 > + >=20 > + >=20 > +def get_fsp_name_from_path(bsf_file): >=20 > + name =3D '' >=20 > + parts =3D bsf_file.split(os.sep) >=20 > + for part in parts: >=20 > + if part.endswith('FspBinPkg'): >=20 > + name =3D part[:-9] >=20 > + break >=20 > + if not name: >=20 > + raise Exception('Could not get FSP name from file path!') >=20 > + return name >=20 > + >=20 > + >=20 > +def usage(): >=20 > + print('\n'.join([ >=20 > + "FspDscBsf2Yaml Version 0.10", >=20 > + "Usage:", >=20 > + " FspDscBsf2Yaml BsfFile|DscFile YamlFile" >=20 > + ])) >=20 > + >=20 > + >=20 > +def main(): >=20 > + # >=20 > + # Parse the options and args >=20 > + # >=20 > + argc =3D len(sys.argv) >=20 > + if argc < 3: >=20 > + usage() >=20 > + return 1 >=20 > + >=20 > + bsf_file =3D sys.argv[1] >=20 > + yaml_file =3D sys.argv[2] >=20 > + if os.path.isdir(yaml_file): >=20 > + yaml_file =3D os.path.join( >=20 > + yaml_file, get_fsp_name_from_path(bsf_file) + '.yaml') >=20 > + >=20 > + if bsf_file.endswith('.dsc'): >=20 > + dsc_file =3D bsf_file >=20 > + bsf_file =3D '' >=20 > + else: >=20 > + dsc_file =3D os.path.splitext(yaml_file)[0] + '.dsc' >=20 > + bsf_to_dsc(bsf_file, dsc_file) >=20 > + >=20 > + dsc_to_yaml(dsc_file, yaml_file) >=20 > + >=20 > + print("'%s' was created successfully!" % yaml_file) >=20 > + >=20 > + return 0 >=20 > + >=20 > + >=20 > +if __name__ =3D=3D '__main__': >=20 > + sys.exit(main()) >=20 > diff --git a/IntelFsp2Pkg/Tools/ConfigEditor/FspGenCfgData.py > b/IntelFsp2Pkg/Tools/ConfigEditor/FspGenCfgData.py > new file mode 100644 > index 0000000000..c37b37a876 > --- /dev/null > +++ b/IntelFsp2Pkg/Tools/ConfigEditor/FspGenCfgData.py > @@ -0,0 +1,2598 @@ > +# @ GenCfgData.py >=20 > +# >=20 > +# Copyright (c) 2014 - 2018, Intel Corporation. All rights reserved.
>=20 > +# SPDX-License-Identifier: BSD-2-Clause-Patent >=20 > +# >=20 > +## >=20 > + >=20 > +import os >=20 > +import re >=20 > +import sys >=20 > +import marshal >=20 > +from functools import reduce >=20 > +from datetime import date >=20 > + >=20 > +# Generated file copyright header >=20 > + >=20 > +__copyright_tmp__ =3D """/** @file >=20 > + >=20 > + Configuration %s File. >=20 > + >=20 > + Copyright (c) %4d, Intel Corporation. All rights reserved.
>=20 > + SPDX-License-Identifier: BSD-2-Clause-Patent >=20 > + >=20 > + This file is automatically generated. Please do NOT modify !!! >=20 > + >=20 > +**/ >=20 > +""" >=20 > + >=20 > +__copyright_dsc__ =3D """## @file >=20 > +# >=20 > +# Copyright (c) %04d, Intel Corporation. All rights reserved.
>=20 > +# SPDX-License-Identifier: BSD-2-Clause-Patent >=20 > +# >=20 > +## >=20 > + >=20 > +[PcdsDynamicVpd.Upd] >=20 > + # >=20 > + # Global definitions in BSF >=20 > + # !BSF BLOCK:{NAME:"FSP UPD Configuration", VER:"0.1"} >=20 > + # >=20 > + >=20 > +""" >=20 > + >=20 > + >=20 > +def Bytes2Val(Bytes): >=20 > + return reduce(lambda x, y: (x << 8) | y, Bytes[::-1]) >=20 > + >=20 > + >=20 > +def Bytes2Str(Bytes): >=20 > + return '{ %s }' % (', '.join('0x%02X' % i for i in Bytes)) >=20 > + >=20 > + >=20 > +def Str2Bytes(Value, Blen): >=20 > + Result =3D bytearray(Value[1:-1], 'utf-8') # Excluding quotes >=20 > + if len(Result) < Blen: >=20 > + Result.extend(b'\x00' * (Blen - len(Result))) >=20 > + return Result >=20 > + >=20 > + >=20 > +def Val2Bytes(Value, Blen): >=20 > + return [(Value >> (i * 8) & 0xff) for i in range(Blen)] >=20 > + >=20 > + >=20 > +def Array2Val(ValStr): >=20 > + ValStr =3D ValStr.strip() >=20 > + if ValStr.startswith('{'): >=20 > + ValStr =3D ValStr[1:] >=20 > + if ValStr.endswith('}'): >=20 > + ValStr =3D ValStr[:-1] >=20 > + if ValStr.startswith("'"): >=20 > + ValStr =3D ValStr[1:] >=20 > + if ValStr.endswith("'"): >=20 > + ValStr =3D ValStr[:-1] >=20 > + Value =3D 0 >=20 > + for Each in ValStr.split(',')[::-1]: >=20 > + Each =3D Each.strip() >=20 > + if Each.startswith('0x'): >=20 > + Base =3D 16 >=20 > + else: >=20 > + Base =3D 10 >=20 > + Value =3D (Value << 8) | int(Each, Base) >=20 > + return Value >=20 > + >=20 > + >=20 > +def GetCopyrightHeader(FileType, AllowModify=3DFalse): >=20 > + FileDescription =3D { >=20 > + 'bsf': 'Boot Setting', >=20 > + 'dsc': 'Definition', >=20 > + 'dlt': 'Delta', >=20 > + 'inc': 'C Binary Blob', >=20 > + 'h': 'C Struct Header' >=20 > + } >=20 > + if FileType in ['bsf', 'dsc', 'dlt']: >=20 > + CommentChar =3D '#' >=20 > + else: >=20 > + CommentChar =3D '' >=20 > + Lines =3D __copyright_tmp__.split('\n') >=20 > + >=20 > + if AllowModify: >=20 > + Lines =3D [Line for Line in Lines if 'Please do NOT modify' not = in Line] >=20 > + >=20 > + CopyrightHdr =3D '\n'.join('%s%s' % ( >=20 > + CommentChar, Line) for Line in Lines)[:-1] + '\n' >=20 > + >=20 > + return CopyrightHdr % (FileDescription[FileType], date.today().year) >=20 > + >=20 > + >=20 > +class CLogicalExpression: >=20 > + def __init__(self): >=20 > + self.index =3D 0 >=20 > + self.string =3D '' >=20 > + >=20 > + def errExit(self, err=3D''): >=20 > + print("ERROR: Express parsing for:") >=20 > + print(" %s" % self.string) >=20 > + print(" %s^" % (' ' * self.index)) >=20 > + if err: >=20 > + print("INFO : %s" % err) >=20 > + raise SystemExit >=20 > + >=20 > + def getNonNumber(self, n1, n2): >=20 > + if not n1.isdigit(): >=20 > + return n1 >=20 > + if not n2.isdigit(): >=20 > + return n2 >=20 > + return None >=20 > + >=20 > + def getCurr(self, lens=3D1): >=20 > + try: >=20 > + if lens =3D=3D -1: >=20 > + return self.string[self.index:] >=20 > + else: >=20 > + if self.index + lens > len(self.string): >=20 > + lens =3D len(self.string) - self.index >=20 > + return self.string[self.index: self.index + lens] >=20 > + except Exception: >=20 > + return '' >=20 > + >=20 > + def isLast(self): >=20 > + return self.index =3D=3D len(self.string) >=20 > + >=20 > + def moveNext(self, len=3D1): >=20 > + self.index +=3D len >=20 > + >=20 > + def skipSpace(self): >=20 > + while not self.isLast(): >=20 > + if self.getCurr() in ' \t': >=20 > + self.moveNext() >=20 > + else: >=20 > + return >=20 > + >=20 > + def normNumber(self, val): >=20 > + return True if val else False >=20 > + >=20 > + def 
getNumber(self, var): >=20 > + var =3D var.strip() >=20 > + if re.match('^0x[a-fA-F0-9]+$', var): >=20 > + value =3D int(var, 16) >=20 > + elif re.match('^[+-]?\\d+$', var): >=20 > + value =3D int(var, 10) >=20 > + else: >=20 > + value =3D None >=20 > + return value >=20 > + >=20 > + def parseValue(self): >=20 > + self.skipSpace() >=20 > + var =3D '' >=20 > + while not self.isLast(): >=20 > + char =3D self.getCurr() >=20 > + if re.match('^[\\w.]', char): >=20 > + var +=3D char >=20 > + self.moveNext() >=20 > + else: >=20 > + break >=20 > + val =3D self.getNumber(var) >=20 > + if val is None: >=20 > + value =3D var >=20 > + else: >=20 > + value =3D "%d" % val >=20 > + return value >=20 > + >=20 > + def parseSingleOp(self): >=20 > + self.skipSpace() >=20 > + if re.match('^NOT\\W', self.getCurr(-1)): >=20 > + self.moveNext(3) >=20 > + op =3D self.parseBrace() >=20 > + val =3D self.getNumber(op) >=20 > + if val is None: >=20 > + self.errExit("'%s' is not a number" % op) >=20 > + return "%d" % (not self.normNumber(int(op))) >=20 > + else: >=20 > + return self.parseValue() >=20 > + >=20 > + def parseBrace(self): >=20 > + self.skipSpace() >=20 > + char =3D self.getCurr() >=20 > + if char =3D=3D '(': >=20 > + self.moveNext() >=20 > + value =3D self.parseExpr() >=20 > + self.skipSpace() >=20 > + if self.getCurr() !=3D ')': >=20 > + self.errExit("Expecting closing brace or operator") >=20 > + self.moveNext() >=20 > + return value >=20 > + else: >=20 > + value =3D self.parseSingleOp() >=20 > + return value >=20 > + >=20 > + def parseCompare(self): >=20 > + value =3D self.parseBrace() >=20 > + while True: >=20 > + self.skipSpace() >=20 > + char =3D self.getCurr() >=20 > + if char in ['<', '>']: >=20 > + self.moveNext() >=20 > + next =3D self.getCurr() >=20 > + if next =3D=3D '=3D': >=20 > + op =3D char + next >=20 > + self.moveNext() >=20 > + else: >=20 > + op =3D char >=20 > + result =3D self.parseBrace() >=20 > + test =3D self.getNonNumber(result, value) >=20 > + if test is None: >=20 > + value =3D "%d" % self.normNumber(eval(value + op + r= esult)) >=20 > + else: >=20 > + self.errExit("'%s' is not a valid number for compari= sion" >=20 > + % test) >=20 > + elif char in ['=3D', '!']: >=20 > + op =3D self.getCurr(2) >=20 > + if op in ['=3D=3D', '!=3D']: >=20 > + self.moveNext(2) >=20 > + result =3D self.parseBrace() >=20 > + test =3D self.getNonNumber(result, value) >=20 > + if test is None: >=20 > + value =3D "%d" % self.normNumber((eval(value + o= p >=20 > + + result))) >=20 > + else: >=20 > + value =3D "%d" % self.normNumber(eval("'" + valu= e + >=20 > + "'" + op + "= '" + >=20 > + result + "'"= )) >=20 > + else: >=20 > + break >=20 > + else: >=20 > + break >=20 > + return value >=20 > + >=20 > + def parseAnd(self): >=20 > + value =3D self.parseCompare() >=20 > + while True: >=20 > + self.skipSpace() >=20 > + if re.match('^AND\\W', self.getCurr(-1)): >=20 > + self.moveNext(3) >=20 > + result =3D self.parseCompare() >=20 > + test =3D self.getNonNumber(result, value) >=20 > + if test is None: >=20 > + value =3D "%d" % self.normNumber(int(value) & int(re= sult)) >=20 > + else: >=20 > + self.errExit("'%s' is not a valid op number for AND"= % >=20 > + test) >=20 > + else: >=20 > + break >=20 > + return value >=20 > + >=20 > + def parseOrXor(self): >=20 > + value =3D self.parseAnd() >=20 > + op =3D None >=20 > + while True: >=20 > + self.skipSpace() >=20 > + op =3D None >=20 > + if re.match('^XOR\\W', self.getCurr(-1)): >=20 > + self.moveNext(3) >=20 > + op =3D '^' >=20 > + elif re.match('^OR\\W', 
self.getCurr(-1)): >=20 > + self.moveNext(2) >=20 > + op =3D '|' >=20 > + else: >=20 > + break >=20 > + if op: >=20 > + result =3D self.parseAnd() >=20 > + test =3D self.getNonNumber(result, value) >=20 > + if test is None: >=20 > + value =3D "%d" % self.normNumber(eval(value + op + r= esult)) >=20 > + else: >=20 > + self.errExit("'%s' is not a valid op number for XOR/= OR" % >=20 > + test) >=20 > + return value >=20 > + >=20 > + def parseExpr(self): >=20 > + return self.parseOrXor() >=20 > + >=20 > + def getResult(self): >=20 > + value =3D self.parseExpr() >=20 > + self.skipSpace() >=20 > + if not self.isLast(): >=20 > + self.errExit("Unexpected character found '%s'" % self.getCur= r()) >=20 > + test =3D self.getNumber(value) >=20 > + if test is None: >=20 > + self.errExit("Result '%s' is not a number" % value) >=20 > + return int(value) >=20 > + >=20 > + def evaluateExpress(self, Expr): >=20 > + self.index =3D 0 >=20 > + self.string =3D Expr >=20 > + if self.getResult(): >=20 > + Result =3D True >=20 > + else: >=20 > + Result =3D False >=20 > + return Result >=20 > + >=20 > + >=20 > +class CFspBsf2Dsc: >=20 > + >=20 > + def __init__(self, bsf_file): >=20 > + self.cfg_list =3D CFspBsf2Dsc.parse_bsf(bsf_file) >=20 > + >=20 > + def get_dsc_lines(self): >=20 > + return CFspBsf2Dsc.generate_dsc(self.cfg_list) >=20 > + >=20 > + def save_dsc(self, dsc_file): >=20 > + return CFspBsf2Dsc.generate_dsc(self.cfg_list, dsc_file) >=20 > + >=20 > + @staticmethod >=20 > + def parse_bsf(bsf_file): >=20 > + >=20 > + fd =3D open(bsf_file, 'r') >=20 > + bsf_txt =3D fd.read() >=20 > + fd.close() >=20 > + >=20 > + find_list =3D [] >=20 > + regex =3D re.compile(r'\s+Find\s+"(.*?)"(.*?)^\s+\$(.*?)\s+', >=20 > + re.S | re.MULTILINE) >=20 > + for match in regex.finditer(bsf_txt): >=20 > + find =3D match.group(1) >=20 > + name =3D match.group(3) >=20 > + if not name.endswith('_Revision'): >=20 > + raise Exception("Unexpected CFG item following 'Find' !"= ) >=20 > + find_list.append((name, find)) >=20 > + >=20 > + idx =3D 0 >=20 > + count =3D 0 >=20 > + prefix =3D '' >=20 > + chk_dict =3D {} >=20 > + cfg_list =3D [] >=20 > + cfg_temp =3D {'find': '', 'cname': '', 'length': 0, 'value': '0'= , >=20 > + 'type': 'Reserved', >=20 > + 'embed': '', 'page': '', 'option': '', 'instance': 0= } >=20 > + regex =3D re.compile( >=20 > + r'^\s+(\$(.*?)|Skip)\s+(\d+)\s+bytes(\s+\$_DEFAULT_\s' >=20 > + r'+=3D\s+(.+?))?$', re.S | >=20 > + re.MULTILINE) >=20 > + >=20 > + for match in regex.finditer(bsf_txt): >=20 > + dlen =3D int(match.group(3)) >=20 > + if match.group(1) =3D=3D 'Skip': >=20 > + key =3D 'gPlatformFspPkgTokenSpaceGuid_BsfSkip%d' % idx >=20 > + val =3D ', '.join(['%02X' % ord(i) for i in '\x00' * dle= n]) >=20 > + idx +=3D 1 >=20 > + option =3D '$SKIP' >=20 > + else: >=20 > + key =3D match.group(2) >=20 > + val =3D match.group(5) >=20 > + option =3D '' >=20 > + >=20 > + cfg_item =3D dict(cfg_temp) >=20 > + finds =3D [i for i in find_list if i[0] =3D=3D key] >=20 > + if len(finds) > 0: >=20 > + if count >=3D 1: >=20 > + # Append a dummy one >=20 > + cfg_item['cname'] =3D 'Dummy' >=20 > + cfg_list.append(dict(cfg_item)) >=20 > + cfg_list[-1]['embed'] =3D '%s:TAG_%03X:END' % \ >=20 > + (prefix, ord(prefix[-1])) >=20 > + prefix =3D finds[0][1] >=20 > + cfg_item['embed'] =3D '%s:TAG_%03X:START' % \ >=20 > + (prefix, ord(prefix[-1])) >=20 > + cfg_item['find'] =3D prefix >=20 > + cfg_item['cname'] =3D 'Signature' >=20 > + cfg_item['length'] =3D len(finds[0][1]) >=20 > + cfg_item['value'] =3D '0x%X' % \ >=20 > + 
Bytes2Val(finds[0][1].encode('UTF-8')) >=20 > + >=20 > + cfg_list.append(dict(cfg_item)) >=20 > + cfg_item =3D dict(cfg_temp) >=20 > + find_list.pop(0) >=20 > + count =3D 0 >=20 > + >=20 > + cfg_item['cname'] =3D key >=20 > + cfg_item['length'] =3D dlen >=20 > + cfg_item['value'] =3D val >=20 > + cfg_item['option'] =3D option >=20 > + >=20 > + if key not in chk_dict.keys(): >=20 > + chk_dict[key] =3D 0 >=20 > + else: >=20 > + chk_dict[key] +=3D 1 >=20 > + cfg_item['instance'] =3D chk_dict[key] >=20 > + >=20 > + cfg_list.append(cfg_item) >=20 > + count +=3D 1 >=20 > + >=20 > + if prefix: >=20 > + cfg_item =3D dict(cfg_temp) >=20 > + cfg_item['cname'] =3D 'Dummy' >=20 > + cfg_item['embed'] =3D '%s:%03X:END' % (prefix, ord(prefix[-1= ])) >=20 > + cfg_list.append(cfg_item) >=20 > + >=20 > + option_dict =3D {} >=20 > + selreg =3D re.compile( >=20 > + r'\s+Selection\s*(.+?)\s*,\s*"(.*?)"$', re.S | >=20 > + re.MULTILINE) >=20 > + regex =3D re.compile( >=20 > + r'^List\s&(.+?)$(.+?)^EndList$', re.S | re.MULTILINE) >=20 > + for match in regex.finditer(bsf_txt): >=20 > + key =3D match.group(1) >=20 > + option_dict[key] =3D [] >=20 > + for select in selreg.finditer(match.group(2)): >=20 > + option_dict[key].append( >=20 > + (int(select.group(1), 0), select.group(2))) >=20 > + >=20 > + chk_dict =3D {} >=20 > + pagereg =3D re.compile( >=20 > + r'^Page\s"(.*?)"$(.+?)^EndPage$', re.S | re.MULTILINE) >=20 > + for match in pagereg.finditer(bsf_txt): >=20 > + page =3D match.group(1) >=20 > + for line in match.group(2).splitlines(): >=20 > + match =3D re.match( >=20 > + r'\s+(Combo|EditNum)\s\$(.+?),\s"(.*?)",\s(.+?),$', = line) >=20 > + if match: >=20 > + cname =3D match.group(2) >=20 > + if cname not in chk_dict.keys(): >=20 > + chk_dict[cname] =3D 0 >=20 > + else: >=20 > + chk_dict[cname] +=3D 1 >=20 > + instance =3D chk_dict[cname] >=20 > + cfg_idxs =3D [i for i, j in enumerate(cfg_list) >=20 > + if j['cname'] =3D=3D cname and >=20 > + j['instance'] =3D=3D instance] >=20 > + if len(cfg_idxs) !=3D 1: >=20 > + raise Exception( >=20 > + "Multiple CFG item '%s' found !" 
% cname) >=20 > + cfg_item =3D cfg_list[cfg_idxs[0]] >=20 > + cfg_item['page'] =3D page >=20 > + cfg_item['type'] =3D match.group(1) >=20 > + cfg_item['prompt'] =3D match.group(3) >=20 > + cfg_item['range'] =3D None >=20 > + if cfg_item['type'] =3D=3D 'Combo': >=20 > + cfg_item['option'] =3D option_dict[match.group(4= )[1:]] >=20 > + elif cfg_item['type'] =3D=3D 'EditNum': >=20 > + cfg_item['option'] =3D match.group(4) >=20 > + match =3D re.match(r'\s+ Help\s"(.*?)"$', line) >=20 > + if match: >=20 > + cfg_item['help'] =3D match.group(1) >=20 > + >=20 > + match =3D re.match(r'\s+"Valid\srange:\s(.*)"$', line) >=20 > + if match: >=20 > + parts =3D match.group(1).split() >=20 > + cfg_item['option'] =3D ( >=20 > + (int(parts[0], 0), int(parts[2], 0), >=20 > + cfg_item['option'])) >=20 > + >=20 > + return cfg_list >=20 > + >=20 > + @staticmethod >=20 > + def generate_dsc(option_list, dsc_file=3DNone): >=20 > + dsc_lines =3D [] >=20 > + header =3D '%s' % (__copyright_dsc__ % date.today().year) >=20 > + dsc_lines.extend(header.splitlines()) >=20 > + >=20 > + pages =3D [] >=20 > + for cfg_item in option_list: >=20 > + if cfg_item['page'] and (cfg_item['page'] not in pages): >=20 > + pages.append(cfg_item['page']) >=20 > + >=20 > + page_id =3D 0 >=20 > + for page in pages: >=20 > + dsc_lines.append(' # !BSF PAGES:{PG%02X::"%s"}' % (page_id,= page)) >=20 > + page_id +=3D 1 >=20 > + dsc_lines.append('') >=20 > + >=20 > + last_page =3D '' >=20 > + for option in option_list: >=20 > + dsc_lines.append('') >=20 > + default =3D option['value'] >=20 > + pos =3D option['cname'].find('_') >=20 > + name =3D option['cname'][pos + 1:] >=20 > + >=20 > + if option['find']: >=20 > + dsc_lines.append(' # !BSF FIND:{%s}' % option['find']) >=20 > + dsc_lines.append('') >=20 > + >=20 > + if option['instance'] > 0: >=20 > + name =3D name + '_%s' % option['instance'] >=20 > + >=20 > + if option['embed']: >=20 > + dsc_lines.append(' # !HDR EMBED:{%s}' % option['embed']= ) >=20 > + >=20 > + if option['type'] =3D=3D 'Reserved': >=20 > + dsc_lines.append(' # !BSF NAME:{Reserved} TYPE:{Reserve= d}') >=20 > + if option['option'] =3D=3D '$SKIP': >=20 > + dsc_lines.append(' # !BSF OPTION:{$SKIP}') >=20 > + else: >=20 > + prompt =3D option['prompt'] >=20 > + >=20 > + if last_page !=3D option['page']: >=20 > + last_page =3D option['page'] >=20 > + dsc_lines.append(' # !BSF PAGE:{PG%02X}' % >=20 > + (pages.index(option['page']))) >=20 > + >=20 > + if option['type'] =3D=3D 'Combo': >=20 > + dsc_lines.append(' # !BSF NAME:{%s} TYPE:{%s}' % >=20 > + (prompt, option['type'])) >=20 > + ops =3D [] >=20 > + for val, text in option['option']: >=20 > + ops.append('0x%x:%s' % (val, text)) >=20 > + dsc_lines.append(' # !BSF OPTION:{%s}' % (', '.join= (ops))) >=20 > + elif option['type'] =3D=3D 'EditNum': >=20 > + cfg_len =3D option['length'] >=20 > + if ',' in default and cfg_len > 8: >=20 > + dsc_lines.append(' # !BSF NAME:{%s} TYPE:{Table= }' % >=20 > + (prompt)) >=20 > + if cfg_len > 16: >=20 > + cfg_len =3D 16 >=20 > + ops =3D [] >=20 > + for i in range(cfg_len): >=20 > + ops.append('%X:1:HEX' % i) >=20 > + dsc_lines.append(' # !BSF OPTION:{%s}' % >=20 > + (', '.join(ops))) >=20 > + else: >=20 > + dsc_lines.append( >=20 > + ' # !BSF NAME:{%s} TYPE:{%s, %s, (0x%X, 0x%= X)}' % >=20 > + (prompt, option['type'], option['option'][2]= , >=20 > + option['option'][0], option['option'][1])) >=20 > + dsc_lines.append(' # !BSF HELP:{%s}' % option['help']) >=20 > + >=20 > + if ',' in default: >=20 > + default =3D '{%s}' % default >=20 > + 
dsc_lines.append(' gCfgData.%-30s | * | 0x%04X | %s' % >=20 > + (name, option['length'], default)) >=20 > + >=20 > + if dsc_file: >=20 > + fd =3D open(dsc_file, 'w') >=20 > + fd.write('\n'.join(dsc_lines)) >=20 > + fd.close() >=20 > + >=20 > + return dsc_lines >=20 > + >=20 > + >=20 > +class CGenCfgData: >=20 > + def __init__(self, Mode=3D''): >=20 > + self.Debug =3D False >=20 > + self.Error =3D '' >=20 > + self.ReleaseMode =3D True >=20 > + self.Mode =3D Mode >=20 > + self._GlobalDataDef =3D """ >=20 > +GlobalDataDef >=20 > + SKUID =3D 0, "DEFAULT" >=20 > +EndGlobalData >=20 > + >=20 > +""" >=20 > + self._BuidinOptionTxt =3D """ >=20 > +List &EN_DIS >=20 > + Selection 0x1 , "Enabled" >=20 > + Selection 0x0 , "Disabled" >=20 > +EndList >=20 > + >=20 > +""" >=20 > + self._StructType =3D ['UINT8', 'UINT16', 'UINT32', 'UINT64'] >=20 > + self._BsfKeyList =3D ['FIND', 'NAME', 'HELP', 'TYPE', 'PAGE', 'P= AGES', >=20 > + 'BLOCK', 'OPTION', 'CONDITION', 'ORDER', 'MA= RKER', >=20 > + 'SUBT'] >=20 > + self._HdrKeyList =3D ['HEADER', 'STRUCT', 'EMBED', 'COMMENT'] >=20 > + self._BuidinOption =3D {'$EN_DIS': 'EN_DIS'} >=20 > + >=20 > + self._MacroDict =3D {} >=20 > + self._VarDict =3D {} >=20 > + self._PcdsDict =3D {} >=20 > + self._CfgBlkDict =3D {} >=20 > + self._CfgPageDict =3D {} >=20 > + self._CfgOptsDict =3D {} >=20 > + self._BsfTempDict =3D {} >=20 > + self._CfgItemList =3D [] >=20 > + self._DscLines =3D [] >=20 > + self._DscFile =3D '' >=20 > + self._CfgPageTree =3D {} >=20 > + >=20 > + self._MapVer =3D 0 >=20 > + self._MinCfgTagId =3D 0x100 >=20 > + >=20 > + def ParseMacros(self, MacroDefStr): >=20 > + # ['-DABC=3D1', '-D', 'CFG_DEBUG=3D1', '-D', 'CFG_OUTDIR=3DBuild= '] >=20 > + self._MacroDict =3D {} >=20 > + IsExpression =3D False >=20 > + for Macro in MacroDefStr: >=20 > + if Macro.startswith('-D'): >=20 > + IsExpression =3D True >=20 > + if len(Macro) > 2: >=20 > + Macro =3D Macro[2:] >=20 > + else: >=20 > + continue >=20 > + if IsExpression: >=20 > + IsExpression =3D False >=20 > + Match =3D re.match("(\\w+)=3D(.+)", Macro) >=20 > + if Match: >=20 > + self._MacroDict[Match.group(1)] =3D Match.group(2) >=20 > + else: >=20 > + Match =3D re.match("(\\w+)", Macro) >=20 > + if Match: >=20 > + self._MacroDict[Match.group(1)] =3D '' >=20 > + if len(self._MacroDict) =3D=3D 0: >=20 > + Error =3D 1 >=20 > + else: >=20 > + Error =3D 0 >=20 > + if self.Debug: >=20 > + print("INFO : Macro dictionary:") >=20 > + for Each in self._MacroDict: >=20 > + print(" $(%s) =3D [ %s ]" % (Each, >=20 > + self._MacroDict[Eac= h])) >=20 > + return Error >=20 > + >=20 > + def EvaulateIfdef(self, Macro): >=20 > + Result =3D Macro in self._MacroDict >=20 > + if self.Debug: >=20 > + print("INFO : Eval Ifdef [%s] : %s" % (Macro, Result)) >=20 > + return Result >=20 > + >=20 > + def ExpandMacros(self, Input, Preserve=3DFalse): >=20 > + Line =3D Input >=20 > + Match =3D re.findall("\\$\\(\\w+\\)", Input) >=20 > + if Match: >=20 > + for Each in Match: >=20 > + Variable =3D Each[2:-1] >=20 > + if Variable in self._MacroDict: >=20 > + Line =3D Line.replace(Each, self._MacroDict[Variable= ]) >=20 > + else: >=20 > + if self.Debug: >=20 > + print("WARN : %s is not defined" % Each) >=20 > + if not Preserve: >=20 > + Line =3D Line.replace(Each, Each[2:-1]) >=20 > + return Line >=20 > + >=20 > + def ExpandPcds(self, Input): >=20 > + Line =3D Input >=20 > + Match =3D re.findall("(\\w+\\.\\w+)", Input) >=20 > + if Match: >=20 > + for PcdName in Match: >=20 > + if PcdName in self._PcdsDict: >=20 > + Line =3D Line.replace(PcdName, 
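Worth pointing out that the two halves of the tool meet at the DSC item syntax: generate_dsc() above emits lines of the form 'gCfgData.Name | * | 0xSize | Value', and ParseDscFile() further down parses the same shape with CfgRegExp. A standalone sketch of that line format (illustration only; the item name DebugSerialPort is made up):

    import re

    line = 'gCfgData.DebugSerialPort | * | 0x04 | 0x3F8'
    # name | offset-or-* | length | default value, as in CfgRegExp
    m = re.match(r'^gCfgData\.(\w+)\s*\|\s*(0x[0-9A-F]+|\*)\s*\|\s*'
                 r'(\d+|0x[0-9a-fA-F]+)\s*\|\s*(.+)$', line)
    name, offset, length, value = m.groups()
    print(name, offset, int(length, 0), value)   # DebugSerialPort * 4 0x3F8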
self._PcdsDict[PcdNam= e]) >=20 > + else: >=20 > + if self.Debug: >=20 > + print("WARN : %s is not defined" % PcdName) >=20 > + return Line >=20 > + >=20 > + def EvaluateExpress(self, Expr): >=20 > + ExpExpr =3D self.ExpandPcds(Expr) >=20 > + ExpExpr =3D self.ExpandMacros(ExpExpr) >=20 > + LogExpr =3D CLogicalExpression() >=20 > + Result =3D LogExpr.evaluateExpress(ExpExpr) >=20 > + if self.Debug: >=20 > + print("INFO : Eval Express [%s] : %s" % (Expr, Result)) >=20 > + return Result >=20 > + >=20 > + def ValueToByteArray(self, ValueStr, Length): >=20 > + Match =3D re.match("\\{\\s*FILE:(.+)\\}", ValueStr) >=20 > + if Match: >=20 > + FileList =3D Match.group(1).split(',') >=20 > + Result =3D bytearray() >=20 > + for File in FileList: >=20 > + File =3D File.strip() >=20 > + BinPath =3D os.path.join(os.path.dirname(self._DscFile),= File) >=20 > + Result.extend(bytearray(open(BinPath, 'rb').read())) >=20 > + else: >=20 > + try: >=20 > + Result =3D bytearray(self.ValueToList(ValueStr, Length)) >=20 > + except ValueError: >=20 > + raise Exception("Bytes in '%s' must be in range 0~255 !"= % >=20 > + ValueStr) >=20 > + if len(Result) < Length: >=20 > + Result.extend(b'\x00' * (Length - len(Result))) >=20 > + elif len(Result) > Length: >=20 > + raise Exception("Value '%s' is too big to fit into %d bytes = !" % >=20 > + (ValueStr, Length)) >=20 > + >=20 > + return Result[:Length] >=20 > + >=20 > + def ValueToList(self, ValueStr, Length): >=20 > + if ValueStr[0] =3D=3D '{': >=20 > + Result =3D [] >=20 > + BinList =3D ValueStr[1:-1].split(',') >=20 > + InBitField =3D False >=20 > + LastInBitField =3D False >=20 > + Value =3D 0 >=20 > + BitLen =3D 0 >=20 > + for Element in BinList: >=20 > + InBitField =3D False >=20 > + Each =3D Element.strip() >=20 > + if len(Each) =3D=3D 0: >=20 > + pass >=20 > + else: >=20 > + if Each[0] in ['"', "'"]: >=20 > + Result.extend(list(bytearray(Each[1:-1], 'utf-8'= ))) >=20 > + elif ':' in Each: >=20 > + Match =3D re.match("(.+):(\\d+)b", Each) >=20 > + if Match is None: >=20 > + raise Exception("Invald value list format '%= s' !" 
>=20 > + % Each) >=20 > + InBitField =3D True >=20 > + CurrentBitLen =3D int(Match.group(2)) >=20 > + CurrentValue =3D ((self.EvaluateExpress(Match.gr= oup(1)) >=20 > + & (1 << CurrentBitLen) - 1)) <<= BitLen >=20 > + else: >=20 > + Result.append(self.EvaluateExpress(Each.strip())= ) >=20 > + if InBitField: >=20 > + Value +=3D CurrentValue >=20 > + BitLen +=3D CurrentBitLen >=20 > + if LastInBitField and ((not InBitField) or (Element =3D= =3D >=20 > + BinList[-1])= ): >=20 > + if BitLen % 8 !=3D 0: >=20 > + raise Exception("Invald bit field length!") >=20 > + Result.extend(Val2Bytes(Value, BitLen // 8)) >=20 > + Value =3D 0 >=20 > + BitLen =3D 0 >=20 > + LastInBitField =3D InBitField >=20 > + elif ValueStr.startswith("'") and ValueStr.endswith("'"): >=20 > + Result =3D Str2Bytes(ValueStr, Length) >=20 > + elif ValueStr.startswith('"') and ValueStr.endswith('"'): >=20 > + Result =3D Str2Bytes(ValueStr, Length) >=20 > + else: >=20 > + Result =3D Val2Bytes(self.EvaluateExpress(ValueStr), Length) >=20 > + return Result >=20 > + >=20 > + def FormatDeltaValue(self, ConfigDict): >=20 > + ValStr =3D ConfigDict['value'] >=20 > + if ValStr[0] =3D=3D "'": >=20 > + # Remove padding \x00 in the value string >=20 > + ValStr =3D "'%s'" % ValStr[1:-1].rstrip('\x00') >=20 > + >=20 > + Struct =3D ConfigDict['struct'] >=20 > + if Struct in self._StructType: >=20 > + # Format the array using its struct type >=20 > + Unit =3D int(Struct[4:]) // 8 >=20 > + Value =3D Array2Val(ConfigDict['value']) >=20 > + Loop =3D ConfigDict['length'] // Unit >=20 > + Values =3D [] >=20 > + for Each in range(Loop): >=20 > + Values.append(Value & ((1 << (Unit * 8)) - 1)) >=20 > + Value =3D Value >> (Unit * 8) >=20 > + ValStr =3D '{ ' + ', '.join([('0x%%0%dX' % (Unit * 2)) % >=20 > + x for x in Values]) + ' }' >=20 > + >=20 > + return ValStr >=20 > + >=20 > + def FormatListValue(self, ConfigDict): >=20 > + Struct =3D ConfigDict['struct'] >=20 > + if Struct not in self._StructType: >=20 > + return >=20 > + >=20 > + DataList =3D self.ValueToList(ConfigDict['value'], ConfigDict['l= ength']) >=20 > + Unit =3D int(Struct[4:]) // 8 >=20 > + if int(ConfigDict['length']) !=3D Unit * len(DataList): >=20 > + # Fallback to byte array >=20 > + Unit =3D 1 >=20 > + if int(ConfigDict['length']) !=3D len(DataList): >=20 > + raise Exception("Array size is not proper for '%s' !" 
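The '{ a:3b, b:5b }' bit-field syntax accepted by ValueToList() above packs the listed fields starting from bit 0 and flushes whole bytes once the widths add up to a multiple of 8. A standalone illustration of that packing (not code from this patch; the field values are made up):

    # Pack (value, bit_width) pairs LSB-first, the way '{ 0x5:3b, 0x1A:5b }'
    # is expected to collapse into a single byte.
    fields = [(0x5, 3), (0x1A, 5)]
    acc, nbits = 0, 0
    for value, width in fields:
        acc |= (value & ((1 << width) - 1)) << nbits
        nbits += width
    assert nbits % 8 == 0                  # same check ValueToList enforces
    print(acc.to_bytes(nbits // 8, 'little').hex())   # 'd5'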
% >=20 > + ConfigDict['cname']) >=20 > + >=20 > + ByteArray =3D [] >=20 > + for Value in DataList: >=20 > + for Loop in range(Unit): >=20 > + ByteArray.append("0x%02X" % (Value & 0xFF)) >=20 > + Value =3D Value >> 8 >=20 > + NewValue =3D '{' + ','.join(ByteArray) + '}' >=20 > + ConfigDict['value'] =3D NewValue >=20 > + >=20 > + return "" >=20 > + >=20 > + def GetOrderNumber(self, Offset, Order, BitOff=3D0): >=20 > + if isinstance(Order, int): >=20 > + if Order =3D=3D -1: >=20 > + Order =3D Offset << 16 >=20 > + else: >=20 > + (Major, Minor) =3D Order.split('.') >=20 > + Order =3D (int(Major, 16) << 16) + ((int(Minor, 16) & 0xFF) = << 8) >=20 > + return Order + (BitOff & 0xFF) >=20 > + >=20 > + def SubtituteLine(self, Line, Args): >=20 > + Args =3D Args.strip() >=20 > + Vars =3D Args.split(':') >=20 > + Line =3D self.ExpandMacros(Line, True) >=20 > + for Idx in range(len(Vars)-1, 0, -1): >=20 > + Line =3D Line.replace('$(%d)' % Idx, Vars[Idx].strip()) >=20 > + return Line >=20 > + >=20 > + def CfgDuplicationCheck(self, CfgDict, Name): >=20 > + if not self.Debug: >=20 > + return >=20 > + >=20 > + if Name =3D=3D 'Dummy': >=20 > + return >=20 > + >=20 > + if Name not in CfgDict: >=20 > + CfgDict[Name] =3D 1 >=20 > + else: >=20 > + print("WARNING: Duplicated item found '%s' !" % >=20 > + CfgDict['cname']) >=20 > + >=20 > + def AddBsfChildPage(self, Child, Parent=3D'root'): >=20 > + def AddBsfChildPageRecursive(PageTree, Parent, Child): >=20 > + Key =3D next(iter(PageTree)) >=20 > + if Parent =3D=3D Key: >=20 > + PageTree[Key].append({Child: []}) >=20 > + return True >=20 > + else: >=20 > + Result =3D False >=20 > + for Each in PageTree[Key]: >=20 > + if AddBsfChildPageRecursive(Each, Parent, Child): >=20 > + Result =3D True >=20 > + break >=20 > + return Result >=20 > + >=20 > + return AddBsfChildPageRecursive(self._CfgPageTree, Parent, Child= ) >=20 > + >=20 > + def ParseDscFile(self, DscFile): >=20 > + self._DscLines =3D [] >=20 > + self._CfgItemList =3D [] >=20 > + self._CfgPageDict =3D {} >=20 > + self._CfgBlkDict =3D {} >=20 > + self._BsfTempDict =3D {} >=20 > + self._CfgPageTree =3D {'root': []} >=20 > + >=20 > + CfgDict =3D {} >=20 > + >=20 > + SectionNameList =3D ["Defines".lower(), "PcdsFeatureFlag".lower(= ), >=20 > + "PcdsDynamicVpd.Tmp".lower(), >=20 > + "PcdsDynamicVpd.Upd".lower()] >=20 > + >=20 > + IsDefSect =3D False >=20 > + IsPcdSect =3D False >=20 > + IsUpdSect =3D False >=20 > + IsTmpSect =3D False >=20 > + >=20 > + TemplateName =3D '' >=20 > + >=20 > + IfStack =3D [] >=20 > + ElifStack =3D [] >=20 > + Error =3D 0 >=20 > + ConfigDict =3D {} >=20 > + >=20 > + if type(DscFile) is list: >=20 > + # it is DSC lines already >=20 > + DscLines =3D DscFile >=20 > + self._DscFile =3D '.' >=20 > + else: >=20 > + DscFd =3D open(DscFile, "r") >=20 > + DscLines =3D DscFd.readlines() >=20 > + DscFd.close() >=20 > + self._DscFile =3D DscFile >=20 > + >=20 > + BsfRegExp =3D re.compile("(%s):{(.+?)}(?:$|\\s+)" % '|'. 
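Small usage note on SubtituteLine() above: it fills the $(1), $(2), ... placeholders of a template line from the colon-separated argument string that a SUBT directive carries. A standalone sketch of the substitution (illustration only; GpioPin and the values are made up):

    def substitute(line, args):
        # args look like 'GPIO_TMPL:7:0x30'; $(1) -> '7', $(2) -> '0x30', ...
        vars_ = args.strip().split(':')
        for idx in range(len(vars_) - 1, 0, -1):
            line = line.replace('$(%d)' % idx, vars_[idx].strip())
        return line

    print(substitute('gCfgData.GpioPin$(1) | * | 0x04 | $(2)',
                     'GPIO_TMPL:7:0x30'))
    # gCfgData.GpioPin7 | * | 0x04 | 0x30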
>=20 > + join(self._BsfKeyList)) >=20 > + HdrRegExp =3D re.compile("(%s):{(.+?)}" % '|'.join(self._HdrKeyL= ist)) >=20 > + CfgRegExp =3D re.compile("^([_a-zA-Z0-9]+)\\s*\\|\\s*\ >=20 > +(0x[0-9A-F]+|\\*)\\s*\\|\\s*(\\d+|0x[0-9a-fA-F]+)\\s*\\|\\s*(.+)") >=20 > + TksRegExp =3D re.compile("^(g[_a-zA-Z0-9]+\\.)(.+)") >=20 > + SkipLines =3D 0 >=20 > + while len(DscLines): >=20 > + DscLine =3D DscLines.pop(0).strip() >=20 > + if SkipLines =3D=3D 0: >=20 > + self._DscLines.append(DscLine) >=20 > + else: >=20 > + SkipLines =3D SkipLines - 1 >=20 > + if len(DscLine) =3D=3D 0: >=20 > + continue >=20 > + >=20 > + Handle =3D False >=20 > + Match =3D re.match("^\\[(.+)\\]", DscLine) >=20 > + if Match is not None: >=20 > + IsDefSect =3D False >=20 > + IsPcdSect =3D False >=20 > + IsUpdSect =3D False >=20 > + IsTmpSect =3D False >=20 > + SectionName =3D Match.group(1).lower() >=20 > + if SectionName =3D=3D SectionNameList[0]: >=20 > + IsDefSect =3D True >=20 > + if SectionName =3D=3D SectionNameList[1]: >=20 > + IsPcdSect =3D True >=20 > + elif SectionName =3D=3D SectionNameList[2]: >=20 > + IsTmpSect =3D True >=20 > + elif SectionName =3D=3D SectionNameList[3]: >=20 > + ConfigDict =3D { >=20 > + 'header': 'ON', >=20 > + 'page': '', >=20 > + 'name': '', >=20 > + 'find': '', >=20 > + 'struct': '', >=20 > + 'embed': '', >=20 > + 'marker': '', >=20 > + 'option': '', >=20 > + 'comment': '', >=20 > + 'condition': '', >=20 > + 'order': -1, >=20 > + 'subreg': [] >=20 > + } >=20 > + IsUpdSect =3D True >=20 > + Offset =3D 0 >=20 > + else: >=20 > + if IsDefSect or IsPcdSect or IsUpdSect or IsTmpSect: >=20 > + Match =3D False if DscLine[0] !=3D '!' else True >=20 > + if Match: >=20 > + Match =3D re.match("^!(else|endif|ifdef|ifndef|i= f|elseif\ >=20 > +|include)\\s*(.+)?$", DscLine.split("#")[0]) >=20 > + Keyword =3D Match.group(1) if Match else '' >=20 > + Remaining =3D Match.group(2) if Match else '' >=20 > + Remaining =3D '' if Remaining is None else Remaining= .strip() >=20 > + >=20 > + if Keyword in ['if', 'elseif', 'ifdef', 'ifndef', 'i= nclude' >=20 > + ] and not Remaining: >=20 > + raise Exception("ERROR: Expression is expected a= fter \ >=20 > +'!if' or !elseif' for line '%s'" % DscLine) >=20 > + >=20 > + if Keyword =3D=3D 'else': >=20 > + if IfStack: >=20 > + IfStack[-1] =3D not IfStack[-1] >=20 > + else: >=20 > + raise Exception("ERROR: No paired '!if' foun= d for \ >=20 > +'!else' for line '%s'" % DscLine) >=20 > + elif Keyword =3D=3D 'endif': >=20 > + if IfStack: >=20 > + IfStack.pop() >=20 > + Level =3D ElifStack.pop() >=20 > + if Level > 0: >=20 > + del IfStack[-Level:] >=20 > + else: >=20 > + raise Exception("ERROR: No paired '!if' foun= d for \ >=20 > +'!endif' for line '%s'" % DscLine) >=20 > + elif Keyword =3D=3D 'ifdef' or Keyword =3D=3D 'ifnde= f': >=20 > + Result =3D self.EvaulateIfdef(Remaining) >=20 > + if Keyword =3D=3D 'ifndef': >=20 > + Result =3D not Result >=20 > + IfStack.append(Result) >=20 > + ElifStack.append(0) >=20 > + elif Keyword =3D=3D 'if' or Keyword =3D=3D 'elseif': >=20 > + Result =3D self.EvaluateExpress(Remaining) >=20 > + if Keyword =3D=3D "if": >=20 > + ElifStack.append(0) >=20 > + IfStack.append(Result) >=20 > + else: # elseif >=20 > + if IfStack: >=20 > + IfStack[-1] =3D not IfStack[-1] >=20 > + IfStack.append(Result) >=20 > + ElifStack[-1] =3D ElifStack[-1] + 1 >=20 > + else: >=20 > + raise Exception("ERROR: No paired '!if' = found for \ >=20 > +'!elif' for line '%s'" % DscLine) >=20 > + else: >=20 > + if IfStack: >=20 > + Handle =3D reduce(lambda x, y: x and y, IfSt= 
ack) >=20 > + else: >=20 > + Handle =3D True >=20 > + if Handle: >=20 > + if Keyword =3D=3D 'include': >=20 > + Remaining =3D self.ExpandMacros(Remainin= g) >=20 > + # Relative to DSC filepath >=20 > + IncludeFilePath =3D os.path.join( >=20 > + os.path.dirname(self._DscFile), Rema= ining) >=20 > + if not os.path.exists(IncludeFilePath): >=20 > + # Relative to repository to find \ >=20 > + # dsc in common platform >=20 > + IncludeFilePath =3D os.path.join( >=20 > + os.path.dirname(self._DscFile), = "..", >=20 > + Remaining) >=20 > + >=20 > + try: >=20 > + IncludeDsc =3D open(IncludeFilePath,= "r") >=20 > + except Exception: >=20 > + raise Exception("ERROR: Cannot open = \ >=20 > +file '%s'." % IncludeFilePath) >=20 > + NewDscLines =3D IncludeDsc.readlines() >=20 > + IncludeDsc.close() >=20 > + DscLines =3D NewDscLines + DscLines >=20 > + del self._DscLines[-1] >=20 > + else: >=20 > + if DscLine.startswith('!'): >=20 > + raise Exception("ERROR: Unrecoginize= d \ >=20 > +directive for line '%s'" % DscLine) >=20 > + >=20 > + if not Handle: >=20 > + del self._DscLines[-1] >=20 > + continue >=20 > + >=20 > + if IsDefSect: >=20 > + Match =3D re.match("^\\s*(?:DEFINE\\s+)*(\\w+)\\s*=3D\\s= *(.+)", >=20 > + DscLine) >=20 > + if Match: >=20 > + self._MacroDict[Match.group(1)] =3D Match.group(2) >=20 > + if self.Debug: >=20 > + print("INFO : DEFINE %s =3D [ %s ]" % (Match.gro= up(1), >=20 > + Match.group= (2))) >=20 > + >=20 > + elif IsPcdSect: >=20 > + Match =3D re.match("^\\s*([\\w\\.]+)\\s*\\|\\s*(\\w+)", = DscLine) >=20 > + if Match: >=20 > + self._PcdsDict[Match.group(1)] =3D Match.group(2) >=20 > + if self.Debug: >=20 > + print("INFO : PCD %s =3D [ %s ]" % (Match.group(= 1), >=20 > + Match.group(2)= )) >=20 > + >=20 > + elif IsTmpSect: >=20 > + # !BSF DEFT:{GPIO_TMPL:START} >=20 > + Match =3D re.match("^\\s*#\\s+(!BSF)\\s+DEFT:{(.+?):\ >=20 > +(START|END)}", DscLine) >=20 > + if Match: >=20 > + if Match.group(3) =3D=3D 'START' and not TemplateNam= e: >=20 > + TemplateName =3D Match.group(2).strip() >=20 > + self._BsfTempDict[TemplateName] =3D [] >=20 > + if Match.group(3) =3D=3D 'END' and ( >=20 > + TemplateName =3D=3D Match.group(2).strip() >=20 > + ) and TemplateName: >=20 > + TemplateName =3D '' >=20 > + else: >=20 > + if TemplateName: >=20 > + Match =3D re.match("^!include\\s*(.+)?$", DscLin= e) >=20 > + if Match: >=20 > + continue >=20 > + self._BsfTempDict[TemplateName].append(DscLine) >=20 > + >=20 > + else: >=20 > + Match =3D re.match("^\\s*#\\s+(!BSF|!HDR)\\s+(.+)", DscL= ine) >=20 > + if Match: >=20 > + Remaining =3D Match.group(2) >=20 > + if Match.group(1) =3D=3D '!BSF': >=20 > + Result =3D BsfRegExp.findall(Remaining) >=20 > + if Result: >=20 > + for Each in Result: >=20 > + Key =3D Each[0] >=20 > + Remaining =3D Each[1] >=20 > + >=20 > + if Key =3D=3D 'BLOCK': >=20 > + Match =3D re.match( >=20 > + "NAME:\"(.+)\"\\s*,\\s*\ >=20 > +VER:\"(.+)\"\\s*", Remaining) >=20 > + if Match: >=20 > + self._CfgBlkDict['name'] =3D \ >=20 > + Match.g= roup(1) >=20 > + self._CfgBlkDict['ver'] =3D Matc= h.group(2 >=20 > + = ) >=20 > + >=20 > + elif Key =3D=3D 'SUBT': >=20 > + # GPIO_TMPL:1:2:3 >=20 > + Remaining =3D Remaining.strip() >=20 > + Match =3D re.match("(\\w+)\\s*:", Re= maining) >=20 > + if Match: >=20 > + TemplateName =3D Match.group(1) >=20 > + for Line in self._BsfTempDict[ >=20 > + TemplateName][::-1]: >=20 > + NewLine =3D self.SubtituteLi= ne( >=20 > + Line, Remaining) >=20 > + DscLines.insert(0, NewLine) >=20 > + SkipLines +=3D 1 >=20 > + >=20 > + elif Key =3D=3D 'PAGES': >=20 > + # 
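The directive handling above keeps one boolean per open conditional on IfStack and only processes a line when every entry is true (the reduce() at the end), while '!else' simply flips the top entry. A minimal standalone sketch of that rule (illustration only):

    from functools import reduce

    if_stack = [True, False]              # e.g. inside '!if 1' then '!if 0'
    handle = reduce(lambda x, y: x and y, if_stack) if if_stack else True
    print(handle)                         # False: the inner block is skipped

    if_stack[-1] = not if_stack[-1]       # effect of '!else'
    print(reduce(lambda x, y: x and y, if_stack))   # True: now handled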
!BSF PAGES:{HSW:"Haswell System Ag= ent", \ >=20 > + # LPT:"Lynx Point PCH"} >=20 > + PageList =3D Remaining.split(',') >=20 > + for Page in PageList: >=20 > + Page =3D Page.strip() >=20 > + Match =3D re.match('(\\w+):\ >=20 > +(\\w*:)?\\"(.+)\\"', Page) >=20 > + if Match: >=20 > + PageName =3D Match.group(1) >=20 > + ParentName =3D Match.group(2= ) >=20 > + if not ParentName or \ >=20 > + ParentName =3D=3D ':': >=20 > + ParentName =3D 'root' >=20 > + else: >=20 > + ParentName =3D ParentNam= e[:-1] >=20 > + if not self.AddBsfChildPage( >=20 > + PageName, ParentName)= : >=20 > + raise Exception("Cannot = find \ >=20 > +parent page '%s'!" % ParentName) >=20 > + self._CfgPageDict[ >=20 > + PageName] =3D Match.grou= p(3) >=20 > + else: >=20 > + raise Exception("Invalid pag= e \ >=20 > +definitions '%s'!" % Page) >=20 > + >=20 > + elif Key in ['NAME', 'HELP', 'OPTION' >=20 > + ] and Remaining.startswith(= '+'): >=20 > + # Allow certain options to be extend= ed \ >=20 > + # to multiple lines >=20 > + ConfigDict[Key.lower()] +=3D Remaini= ng[1:] >=20 > + >=20 > + else: >=20 > + if Key =3D=3D 'NAME': >=20 > + Remaining =3D Remaining.strip() >=20 > + elif Key =3D=3D 'CONDITION': >=20 > + Remaining =3D self.ExpandMacros( >=20 > + Remaining.strip()) >=20 > + ConfigDict[Key.lower()] =3D Remainin= g >=20 > + else: >=20 > + Match =3D HdrRegExp.match(Remaining) >=20 > + if Match: >=20 > + Key =3D Match.group(1) >=20 > + Remaining =3D Match.group(2) >=20 > + if Key =3D=3D 'EMBED': >=20 > + Parts =3D Remaining.split(':') >=20 > + Names =3D Parts[0].split(',') >=20 > + DummyDict =3D ConfigDict.copy() >=20 > + if len(Names) > 1: >=20 > + Remaining =3D Names[0] + ':' + ':'.j= oin( >=20 > + Parts[1:]) >=20 > + DummyDict['struct'] =3D Names[1] >=20 > + else: >=20 > + DummyDict['struct'] =3D Names[0] >=20 > + DummyDict['cname'] =3D 'Dummy' >=20 > + DummyDict['name'] =3D '' >=20 > + DummyDict['embed'] =3D Remaining >=20 > + DummyDict['offset'] =3D Offset >=20 > + DummyDict['length'] =3D 0 >=20 > + DummyDict['value'] =3D '0' >=20 > + DummyDict['type'] =3D 'Reserved' >=20 > + DummyDict['help'] =3D '' >=20 > + DummyDict['subreg'] =3D [] >=20 > + self._CfgItemList.append(DummyDict) >=20 > + else: >=20 > + ConfigDict[Key.lower()] =3D Remaining >=20 > + # Check CFG line >=20 > + # gCfgData.VariableName | * | 0x01 | 0x1 >=20 > + Clear =3D False >=20 > + >=20 > + Match =3D TksRegExp.match(DscLine) >=20 > + if Match: >=20 > + DscLine =3D 'gCfgData.%s' % Match.group(2) >=20 > + >=20 > + if DscLine.startswith('gCfgData.'): >=20 > + Match =3D CfgRegExp.match(DscLine[9:]) >=20 > + else: >=20 > + Match =3D None >=20 > + if Match: >=20 > + ConfigDict['space'] =3D 'gCfgData' >=20 > + ConfigDict['cname'] =3D Match.group(1) >=20 > + if Match.group(2) !=3D '*': >=20 > + Offset =3D int(Match.group(2), 16) >=20 > + ConfigDict['offset'] =3D Offset >=20 > + ConfigDict['order'] =3D self.GetOrderNumber( >=20 > + ConfigDict['offset'], ConfigDict['order']) >=20 > + >=20 > + Value =3D Match.group(4).strip() >=20 > + if Match.group(3).startswith("0x"): >=20 > + Length =3D int(Match.group(3), 16) >=20 > + else: >=20 > + Length =3D int(Match.group(3)) >=20 > + >=20 > + Offset +=3D Length >=20 > + >=20 > + ConfigDict['length'] =3D Length >=20 > + Match =3D re.match("\\$\\((\\w+)\\)", Value) >=20 > + if Match: >=20 > + if Match.group(1) in self._MacroDict: >=20 > + Value =3D self._MacroDict[Match.group(1)] >=20 > + >=20 > + ConfigDict['value'] =3D Value >=20 > + if re.match("\\{\\s*FILE:(.+)\\}", Value): >=20 > + # Expand embedded binary file 
>=20 > + ValArray =3D self.ValueToByteArray(ConfigDict['v= alue'], >=20 > + ConfigDict['len= gth']) >=20 > + NewValue =3D Bytes2Str(ValArray) >=20 > + self._DscLines[-1] =3D re.sub(r'(.*)(\{\s*FILE:.= +\})', >=20 > + r'\1 %s' % NewValue, >=20 > + self._DscLines[-1]) >=20 > + ConfigDict['value'] =3D NewValue >=20 > + >=20 > + if ConfigDict['name'] =3D=3D '': >=20 > + # Clear BSF specific items >=20 > + ConfigDict['bsfname'] =3D '' >=20 > + ConfigDict['help'] =3D '' >=20 > + ConfigDict['type'] =3D '' >=20 > + ConfigDict['option'] =3D '' >=20 > + >=20 > + self.CfgDuplicationCheck(CfgDict, ConfigDict['cname'= ]) >=20 > + self._CfgItemList.append(ConfigDict.copy()) >=20 > + Clear =3D True >=20 > + >=20 > + else: >=20 > + # It could be a virtual item as below >=20 > + # !BSF FIELD:{SerialDebugPortAddress0:1} >=20 > + # or >=20 > + # @Bsf FIELD:{SerialDebugPortAddress0:1b} >=20 > + Match =3D re.match(r"^\s*#\s+(!BSF)\s+FIELD:{(.+)}",= DscLine) >=20 > + if Match: >=20 > + BitFieldTxt =3D Match.group(2) >=20 > + Match =3D re.match("(.+):(\\d+)b([BWDQ])?", BitF= ieldTxt) >=20 > + if not Match: >=20 > + raise Exception("Incorrect bit field \ >=20 > +format '%s' !" % BitFieldTxt) >=20 > + UnitBitLen =3D 1 >=20 > + SubCfgDict =3D ConfigDict.copy() >=20 > + SubCfgDict['cname'] =3D Match.group(1) >=20 > + SubCfgDict['bitlength'] =3D int( >=20 > + Match.group(2)) * UnitBitLen >=20 > + if SubCfgDict['bitlength'] > 0: >=20 > + LastItem =3D self._CfgItemList[-1] >=20 > + if len(LastItem['subreg']) =3D=3D 0: >=20 > + SubOffset =3D 0 >=20 > + else: >=20 > + SubOffset =3D \ >=20 > + LastItem['subreg'][-1]['bitoff= set'] \ >=20 > + + LastItem['subreg'][-1]['bitl= ength'] >=20 > + if Match.group(3) =3D=3D 'B': >=20 > + SubCfgDict['bitunit'] =3D 1 >=20 > + elif Match.group(3) =3D=3D 'W': >=20 > + SubCfgDict['bitunit'] =3D 2 >=20 > + elif Match.group(3) =3D=3D 'Q': >=20 > + SubCfgDict['bitunit'] =3D 8 >=20 > + else: >=20 > + SubCfgDict['bitunit'] =3D 4 >=20 > + SubCfgDict['bitoffset'] =3D SubOffset >=20 > + SubCfgDict['order'] =3D self.GetOrderNumber( >=20 > + SubCfgDict['offset'], SubCfgDict['order'= ], >=20 > + SubOffset) >=20 > + SubCfgDict['value'] =3D '' >=20 > + SubCfgDict['cname'] =3D '%s_%s' % (LastItem[= 'cname'], >=20 > + Match.group= (1)) >=20 > + self.CfgDuplicationCheck(CfgDict, >=20 > + SubCfgDict['cname']= ) >=20 > + LastItem['subreg'].append(SubCfgDict.copy()) >=20 > + Clear =3D True >=20 > + >=20 > + if Clear: >=20 > + ConfigDict['name'] =3D '' >=20 > + ConfigDict['find'] =3D '' >=20 > + ConfigDict['struct'] =3D '' >=20 > + ConfigDict['embed'] =3D '' >=20 > + ConfigDict['marker'] =3D '' >=20 > + ConfigDict['comment'] =3D '' >=20 > + ConfigDict['order'] =3D -1 >=20 > + ConfigDict['subreg'] =3D [] >=20 > + ConfigDict['option'] =3D '' >=20 > + ConfigDict['condition'] =3D '' >=20 > + >=20 > + return Error >=20 > + >=20 > + def GetBsfBitFields(self, subitem, bytes): >=20 > + start =3D subitem['bitoffset'] >=20 > + end =3D start + subitem['bitlength'] >=20 > + bitsvalue =3D ''.join('{0:08b}'.format(i) for i in bytes[::-1]) >=20 > + bitsvalue =3D bitsvalue[::-1] >=20 > + bitslen =3D len(bitsvalue) >=20 > + if start > bitslen or end > bitslen: >=20 > + raise Exception("Invalid bits offset [%d,%d] %d for %s" % >=20 > + (start, end, bitslen, subitem['name'])) >=20 > + return '0x%X' % (int(bitsvalue[start:end][::-1], 2)) >=20 > + >=20 > + def UpdateBsfBitFields(self, SubItem, NewValue, ValueArray): >=20 > + Start =3D SubItem['bitoffset'] >=20 > + End =3D Start + SubItem['bitlength'] >=20 > + Blen =3D 
len(ValueArray) >=20 > + BitsValue =3D ''.join('{0:08b}'.format(i) for i in ValueArray[::= -1]) >=20 > + BitsValue =3D BitsValue[::-1] >=20 > + BitsLen =3D len(BitsValue) >=20 > + if Start > BitsLen or End > BitsLen: >=20 > + raise Exception("Invalid bits offset [%d,%d] %d for %s" % >=20 > + (Start, End, BitsLen, SubItem['name'])) >=20 > + BitsValue =3D BitsValue[:Start] + '{0:0{1}b}'.format( >=20 > + NewValue, SubItem['bitlength'])[::-1] + BitsValue[End:] >=20 > + ValueArray[:] =3D bytearray.fromhex( >=20 > + '{0:0{1}x}'.format(int(BitsValue[::-1], 2), Blen * 2))[::-1] >=20 > + >=20 > + def CreateVarDict(self): >=20 > + Error =3D 0 >=20 > + self._VarDict =3D {} >=20 > + if len(self._CfgItemList) > 0: >=20 > + Item =3D self._CfgItemList[-1] >=20 > + self._VarDict['_LENGTH_'] =3D '%d' % (Item['offset'] + >=20 > + Item['length']) >=20 > + for Item in self._CfgItemList: >=20 > + Embed =3D Item['embed'] >=20 > + Match =3D re.match("^(\\w+):(\\w+):(START|END)", Embed) >=20 > + if Match: >=20 > + StructName =3D Match.group(1) >=20 > + VarName =3D '_%s_%s_' % (Match.group(3), StructName) >=20 > + if Match.group(3) =3D=3D 'END': >=20 > + self._VarDict[VarName] =3D Item['offset'] + Item['le= ngth'] >=20 > + self._VarDict['_LENGTH_%s_' % StructName] =3D \ >=20 > + self._VarDict['_END_%s_' % StructName] - \ >=20 > + self._VarDict['_START_%s_' % StructName] >=20 > + if Match.group(2).startswith('TAG_'): >=20 > + if (self.Mode !=3D 'FSP') and (self._VarDict >=20 > + ['_LENGTH_%s_' % >=20 > + StructName] % 4): >=20 > + raise Exception("Size of structure '%s' is %= d, \ >=20 > +not DWORD aligned !" % (StructName, self._VarDict['_LENGTH_%s_' % > StructName])) >=20 > + self._VarDict['_TAG_%s_' % StructName] =3D int( >=20 > + Match.group(2)[4:], 16) & 0xFFF >=20 > + else: >=20 > + self._VarDict[VarName] =3D Item['offset'] >=20 > + if Item['marker']: >=20 > + self._VarDict['_OFFSET_%s_' % Item['marker'].strip()] = =3D \ >=20 > + Item['offset'] >=20 > + return Error >=20 > + >=20 > + def UpdateBsfBitUnit(self, Item): >=20 > + BitTotal =3D 0 >=20 > + BitOffset =3D 0 >=20 > + StartIdx =3D 0 >=20 > + Unit =3D None >=20 > + UnitDec =3D {1: 'BYTE', 2: 'WORD', 4: 'DWORD', 8: 'QWORD'} >=20 > + for Idx, SubItem in enumerate(Item['subreg']): >=20 > + if Unit is None: >=20 > + Unit =3D SubItem['bitunit'] >=20 > + BitLength =3D SubItem['bitlength'] >=20 > + BitTotal +=3D BitLength >=20 > + BitOffset +=3D BitLength >=20 > + >=20 > + if BitOffset > 64 or BitOffset > Unit * 8: >=20 > + break >=20 > + >=20 > + if BitOffset =3D=3D Unit * 8: >=20 > + for SubIdx in range(StartIdx, Idx + 1): >=20 > + Item['subreg'][SubIdx]['bitunit'] =3D Unit >=20 > + BitOffset =3D 0 >=20 > + StartIdx =3D Idx + 1 >=20 > + Unit =3D None >=20 > + >=20 > + if BitOffset > 0: >=20 > + raise Exception("Bit fields cannot fit into %s for \ >=20 > +'%s.%s' !" % (UnitDec[Unit], Item['cname'], SubItem['cname'])) >=20 > + >=20 > + ExpectedTotal =3D Item['length'] * 8 >=20 > + if Item['length'] * 8 !=3D BitTotal: >=20 > + raise Exception("Bit fields total length (%d) does not match= \ >=20 > +length (%d) of '%s' !" 
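For reference, GetBsfBitFields()/UpdateBsfBitFields() build a bit string from the little-endian byte array so that bit offset 0 means bit 0 of byte 0. The same extraction can be pictured with plain integer arithmetic (standalone illustration, not code from this patch):

    def get_bits(data, bitoffset, bitlength):
        # Treat 'data' as a little-endian integer and pull out the field,
        # matching what GetBsfBitFields computes via its bit-string form.
        value = int.from_bytes(bytes(data), 'little')
        return (value >> bitoffset) & ((1 << bitlength) - 1)

    data = bytearray([0xD5, 0x00])
    print(hex(get_bits(data, 0, 3)))   # 0x5
    print(hex(get_bits(data, 3, 5)))   # 0x1a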
% (BitTotal, ExpectedTotal, Item['cname'])) >=20 > + >=20 > + def UpdateDefaultValue(self): >=20 > + Error =3D 0 >=20 > + for Idx, Item in enumerate(self._CfgItemList): >=20 > + if len(Item['subreg']) =3D=3D 0: >=20 > + Value =3D Item['value'] >=20 > + if (len(Value) > 0) and (Value[0] =3D=3D '{' or Value[0]= =3D=3D "'" or >=20 > + Value[0] =3D=3D '"'): >=20 > + # {XXX} or 'XXX' strings >=20 > + self.FormatListValue(self._CfgItemList[Idx]) >=20 > + else: >=20 > + Match =3D re.match("(0x[0-9a-fA-F]+|[0-9]+)", Value) >=20 > + if not Match: >=20 > + NumValue =3D self.EvaluateExpress(Value) >=20 > + Item['value'] =3D '0x%X' % NumValue >=20 > + else: >=20 > + ValArray =3D self.ValueToByteArray(Item['value'], Item['= length']) >=20 > + for SubItem in Item['subreg']: >=20 > + SubItem['value'] =3D self.GetBsfBitFields(SubItem, V= alArray) >=20 > + self.UpdateBsfBitUnit(Item) >=20 > + return Error >=20 > + >=20 > + @staticmethod >=20 > + def ExpandIncludeFiles(FilePath, CurDir=3D''): >=20 > + if CurDir =3D=3D '': >=20 > + CurDir =3D os.path.dirname(FilePath) >=20 > + FilePath =3D os.path.basename(FilePath) >=20 > + >=20 > + InputFilePath =3D os.path.join(CurDir, FilePath) >=20 > + File =3D open(InputFilePath, "r") >=20 > + Lines =3D File.readlines() >=20 > + File.close() >=20 > + >=20 > + NewLines =3D [] >=20 > + for LineNum, Line in enumerate(Lines): >=20 > + Match =3D re.match("^!include\\s*(.+)?$", Line) >=20 > + if Match: >=20 > + IncPath =3D Match.group(1) >=20 > + TmpPath =3D os.path.join(CurDir, IncPath) >=20 > + OrgPath =3D TmpPath >=20 > + if not os.path.exists(TmpPath): >=20 > + CurDir =3D os.path.join(os.path.dirname( >=20 > + os.path.realpath(__file__)), "..", "..") >=20 > + TmpPath =3D os.path.join(CurDir, IncPath) >=20 > + if not os.path.exists(TmpPath): >=20 > + raise Exception("ERROR: Cannot open include file '%s= '." % >=20 > + OrgPath) >=20 > + else: >=20 > + NewLines.append(('# Included from file: %s\n' % >=20 > + IncPath, TmpPath, 0)) >=20 > + NewLines.append(('# %s\n' % ('=3D' * 80), TmpPath, 0= )) >=20 > + NewLines.extend(CGenCfgData.ExpandIncludeFiles >=20 > + (IncPath, CurDir)) >=20 > + else: >=20 > + NewLines.append((Line, InputFilePath, LineNum)) >=20 > + >=20 > + return NewLines >=20 > + >=20 > + def OverrideDefaultValue(self, DltFile): >=20 > + Error =3D 0 >=20 > + DltLines =3D CGenCfgData.ExpandIncludeFiles(DltFile) >=20 > + >=20 > + PlatformId =3D None >=20 > + for Line, FilePath, LineNum in DltLines: >=20 > + Line =3D Line.strip() >=20 > + if not Line or Line.startswith('#'): >=20 > + continue >=20 > + Match =3D re.match("\\s*(\\w+)\\.(\\w+)(\\.\\w+)?\\s*\\|\\s*= (.+)", >=20 > + Line) >=20 > + if not Match: >=20 > + raise Exception("Unrecognized line '%s' (File:'%s' Line:= %d) !" >=20 > + % (Line, FilePath, LineNum + 1)) >=20 > + >=20 > + Found =3D False >=20 > + InScope =3D False >=20 > + for Idx, Item in enumerate(self._CfgItemList): >=20 > + if not InScope: >=20 > + if not (Item['embed'].endswith(':START') and >=20 > + Item['embed'].startswith(Match.group(1))): >=20 > + continue >=20 > + InScope =3D True >=20 > + if Item['cname'] =3D=3D Match.group(2): >=20 > + Found =3D True >=20 > + break >=20 > + if Item['embed'].endswith(':END') and \ >=20 > + Item['embed'].startswith(Match.group(1)): >=20 > + break >=20 > + Name =3D '%s.%s' % (Match.group(1), Match.group(2)) >=20 > + if not Found: >=20 > + ErrItem =3D Match.group(2) if InScope else Match.group(1= ) >=20 > + raise Exception("Invalid configuration '%s' in '%s' \ >=20 > +(File:'%s' Line:%d) !" 
% (ErrItem, Name, FilePath, LineNum + 1)) >=20 > + >=20 > + ValueStr =3D Match.group(4).strip() >=20 > + if Match.group(3) is not None: >=20 > + # This is a subregion item >=20 > + BitField =3D Match.group(3)[1:] >=20 > + Found =3D False >=20 > + if len(Item['subreg']) > 0: >=20 > + for SubItem in Item['subreg']: >=20 > + if SubItem['cname'] =3D=3D '%s_%s' % \ >=20 > + (Item['cname'], BitField): >=20 > + Found =3D True >=20 > + break >=20 > + if not Found: >=20 > + raise Exception("Invalid configuration bit field \ >=20 > +'%s' in '%s.%s' (File:'%s' Line:%d) !" % (BitField, Name, BitField, >=20 > + FilePath, LineNum + 1)) >=20 > + >=20 > + try: >=20 > + Value =3D int(ValueStr, 16) if ValueStr.startswith('= 0x') \ >=20 > + else int(ValueStr, 10) >=20 > + except Exception: >=20 > + raise Exception("Invalid value '%s' for bit field '%= s.%s' \ >=20 > +(File:'%s' Line:%d) !" % (ValueStr, Name, BitField, FilePath, LineNum + = 1)) >=20 > + >=20 > + if Value >=3D 2 ** SubItem['bitlength']: >=20 > + raise Exception("Invalid configuration bit field val= ue \ >=20 > +'%s' for '%s.%s' (File:'%s' Line:%d) !" % (Value, Name, BitField, >=20 > + FilePath, LineNum + 1)) >=20 > + >=20 > + ValArray =3D self.ValueToByteArray(Item['value'], Item['= length']) >=20 > + self.UpdateBsfBitFields(SubItem, Value, ValArray) >=20 > + >=20 > + if Item['value'].startswith('{'): >=20 > + Item['value'] =3D '{' + ', '.join('0x%02X' % i >=20 > + for i in ValArray) += '}' >=20 > + else: >=20 > + BitsValue =3D ''.join('{0:08b}'.format(i) >=20 > + for i in ValArray[::-1]) >=20 > + Item['value'] =3D '0x%X' % (int(BitsValue, 2)) >=20 > + else: >=20 > + if Item['value'].startswith('{') and \ >=20 > + not ValueStr.startswith('{'): >=20 > + raise Exception("Data array required for '%s' \ >=20 > +(File:'%s' Line:%d) !" % (Name, FilePath, LineNum + 1)) >=20 > + Item['value'] =3D ValueStr >=20 > + >=20 > + if Name =3D=3D 'PLATFORMID_CFG_DATA.PlatformId': >=20 > + PlatformId =3D ValueStr >=20 > + >=20 > + if (PlatformId is None) and (self.Mode !=3D 'FSP'): >=20 > + raise Exception("PLATFORMID_CFG_DATA.PlatformId is missi= ng \ >=20 > +in file '%s' !" 
% (DltFile)) >=20 > + >=20 > + return Error >=20 > + >=20 > + def ProcessMultilines(self, String, MaxCharLength): >=20 > + Multilines =3D '' >=20 > + StringLength =3D len(String) >=20 > + CurrentStringStart =3D 0 >=20 > + StringOffset =3D 0 >=20 > + BreakLineDict =3D [] >=20 > + if len(String) <=3D MaxCharLength: >=20 > + while (StringOffset < StringLength): >=20 > + if StringOffset >=3D 1: >=20 > + if String[StringOffset - 1] =3D=3D '\\' and \ >=20 > + String[StringOffset] =3D=3D 'n': >=20 > + BreakLineDict.append(StringOffset + 1) >=20 > + StringOffset +=3D 1 >=20 > + if BreakLineDict !=3D []: >=20 > + for Each in BreakLineDict: >=20 > + Multilines +=3D " %s\n" % String[CurrentStringStart= :Each].\ >=20 > + lstrip() >=20 > + CurrentStringStart =3D Each >=20 > + if StringLength - CurrentStringStart > 0: >=20 > + Multilines +=3D " %s\n" % String[CurrentStringStart= :].\ >=20 > + lstrip() >=20 > + else: >=20 > + Multilines =3D " %s\n" % String >=20 > + else: >=20 > + NewLineStart =3D 0 >=20 > + NewLineCount =3D 0 >=20 > + FoundSpaceChar =3D False >=20 > + while(StringOffset < StringLength): >=20 > + if StringOffset >=3D 1: >=20 > + if NewLineCount >=3D MaxCharLength - 1: >=20 > + if String[StringOffset] =3D=3D ' ' and \ >=20 > + StringLength - StringOffset > 10: >=20 > + BreakLineDict.append(NewLineStart + NewLineC= ount) >=20 > + NewLineStart =3D NewLineStart + NewLineCount >=20 > + NewLineCount =3D 0 >=20 > + FoundSpaceChar =3D True >=20 > + elif StringOffset =3D=3D StringLength - 1 \ >=20 > + and FoundSpaceChar is False: >=20 > + BreakLineDict.append(0) >=20 > + if String[StringOffset - 1] =3D=3D '\\' and \ >=20 > + String[StringOffset] =3D=3D 'n': >=20 > + BreakLineDict.append(StringOffset + 1) >=20 > + NewLineStart =3D StringOffset + 1 >=20 > + NewLineCount =3D 0 >=20 > + StringOffset +=3D 1 >=20 > + NewLineCount +=3D 1 >=20 > + if BreakLineDict !=3D []: >=20 > + BreakLineDict.sort() >=20 > + for Each in BreakLineDict: >=20 > + if Each > 0: >=20 > + Multilines +=3D " %s\n" % String[ >=20 > + CurrentStringStart:Each].lstrip() >=20 > + CurrentStringStart =3D Each >=20 > + if StringLength - CurrentStringStart > 0: >=20 > + Multilines +=3D " %s\n" % String[CurrentStringStart= :].\ >=20 > + lstrip() >=20 > + return Multilines >=20 > + >=20 > + def CreateField(self, Item, Name, Length, Offset, Struct, >=20 > + BsfName, Help, Option, BitsLength=3DNone): >=20 > + PosName =3D 28 >=20 > + NameLine =3D '' >=20 > + HelpLine =3D '' >=20 > + OptionLine =3D '' >=20 > + >=20 > + if Length =3D=3D 0 and Name =3D=3D 'Dummy': >=20 > + return '\n' >=20 > + >=20 > + IsArray =3D False >=20 > + if Length in [1, 2, 4, 8]: >=20 > + Type =3D "UINT%d" % (Length * 8) >=20 > + else: >=20 > + IsArray =3D True >=20 > + Type =3D "UINT8" >=20 > + >=20 > + if Item and Item['value'].startswith('{'): >=20 > + Type =3D "UINT8" >=20 > + IsArray =3D True >=20 > + >=20 > + if Struct !=3D '': >=20 > + Type =3D Struct >=20 > + if Struct in ['UINT8', 'UINT16', 'UINT32', 'UINT64']: >=20 > + IsArray =3D True >=20 > + Unit =3D int(Type[4:]) // 8 >=20 > + Length =3D Length / Unit >=20 > + else: >=20 > + IsArray =3D False >=20 > + >=20 > + if IsArray: >=20 > + Name =3D Name + '[%d]' % Length >=20 > + >=20 > + if len(Type) < PosName: >=20 > + Space1 =3D PosName - len(Type) >=20 > + else: >=20 > + Space1 =3D 1 >=20 > + >=20 > + if BsfName !=3D '': >=20 > + NameLine =3D " %s\n" % BsfName >=20 > + else: >=20 > + NameLine =3D "\n" >=20 > + >=20 > + if Help !=3D '': >=20 > + HelpLine =3D self.ProcessMultilines(Help, 80) >=20 > + >=20 > + 
if Option !=3D '': >=20 > + OptionLine =3D self.ProcessMultilines(Option, 80) >=20 > + >=20 > + if BitsLength is None: >=20 > + BitsLength =3D '' >=20 > + else: >=20 > + BitsLength =3D ' : %d' % BitsLength >=20 > + >=20 > + return "\n/** %s%s%s**/\n %s%s%s%s;\n" % \ >=20 > + (NameLine, HelpLine, OptionLine, Type, ' ' * Space1, Name= , >=20 > + BitsLength) >=20 > + >=20 > + def SplitTextBody(self, TextBody): >=20 > + Marker1 =3D '{ /* _COMMON_STRUCT_START_ */' >=20 > + Marker2 =3D '; /* _COMMON_STRUCT_END_ */' >=20 > + ComBody =3D [] >=20 > + TxtBody =3D [] >=20 > + IsCommon =3D False >=20 > + for Line in TextBody: >=20 > + if Line.strip().endswith(Marker1): >=20 > + Line =3D Line.replace(Marker1[1:], '') >=20 > + IsCommon =3D True >=20 > + if Line.strip().endswith(Marker2): >=20 > + Line =3D Line.replace(Marker2[1:], '') >=20 > + if IsCommon: >=20 > + ComBody.append(Line) >=20 > + IsCommon =3D False >=20 > + continue >=20 > + if IsCommon: >=20 > + ComBody.append(Line) >=20 > + else: >=20 > + TxtBody.append(Line) >=20 > + return ComBody, TxtBody >=20 > + >=20 > + def GetStructArrayInfo(self, Input): >=20 > + ArrayStr =3D Input.split('[') >=20 > + Name =3D ArrayStr[0] >=20 > + if len(ArrayStr) > 1: >=20 > + NumStr =3D ''.join(c for c in ArrayStr[-1] if c.isdigit()) >=20 > + NumStr =3D '1000' if len(NumStr) =3D=3D 0 else NumStr >=20 > + ArrayNum =3D int(NumStr) >=20 > + else: >=20 > + ArrayNum =3D 0 >=20 > + return Name, ArrayNum >=20 > + >=20 > + def PostProcessBody(self, TextBody, IncludeEmbedOnly=3DTrue): >=20 > + NewTextBody =3D [] >=20 > + OldTextBody =3D [] >=20 > + IncTextBody =3D [] >=20 > + StructBody =3D [] >=20 > + IncludeLine =3D False >=20 > + EmbedFound =3D False >=20 > + StructName =3D '' >=20 > + ArrayVarName =3D '' >=20 > + VariableName =3D '' >=20 > + Count =3D 0 >=20 > + Level =3D 0 >=20 > + IsCommonStruct =3D False >=20 > + >=20 > + for Line in TextBody: >=20 > + if Line.startswith('#define '): >=20 > + IncTextBody.append(Line) >=20 > + continue >=20 > + >=20 > + if not Line.startswith('/* EMBED_STRUCT:'): >=20 > + Match =3D False >=20 > + else: >=20 > + Match =3D re.match("^/\\*\\sEMBED_STRUCT:([\\w\\[\\]\\*]= +):\ >=20 > +([\\w\\[\\]\\*]+):(\\w+):(START|END)([\\s\\d]+)\\*/([\\s\\S]*)", Line) >=20 > + >=20 > + if Match: >=20 > + ArrayMarker =3D Match.group(5) >=20 > + if Match.group(4) =3D=3D 'END': >=20 > + Level -=3D 1 >=20 > + if Level =3D=3D 0: >=20 > + Line =3D Match.group(6) >=20 > + else: # 'START' >=20 > + Level +=3D 1 >=20 > + if Level =3D=3D 1: >=20 > + Line =3D Match.group(6) >=20 > + else: >=20 > + EmbedFound =3D True >=20 > + TagStr =3D Match.group(3) >=20 > + if TagStr.startswith('TAG_'): >=20 > + try: >=20 > + TagVal =3D int(TagStr[4:], 16) >=20 > + except Exception: >=20 > + TagVal =3D -1 >=20 > + if (TagVal >=3D 0) and (TagVal < self._MinCfgTag= Id): >=20 > + IsCommonStruct =3D True >=20 > + >=20 > + if Level =3D=3D 1: >=20 > + if IsCommonStruct: >=20 > + Suffix =3D ' /* _COMMON_STRUCT_START_ */' >=20 > + else: >=20 > + Suffix =3D '' >=20 > + StructBody =3D ['typedef struct {%s' % Suffix] >=20 > + StructName =3D Match.group(1) >=20 > + StructType =3D Match.group(2) >=20 > + VariableName =3D Match.group(3) >=20 > + MatchOffset =3D re.search('/\\*\\*\\sOffset\\s0x= \ >=20 > +([a-fA-F0-9]+)', Line) >=20 > + if MatchOffset: >=20 > + Offset =3D int(MatchOffset.group(1), 16) >=20 > + else: >=20 > + Offset =3D None >=20 > + IncludeLine =3D True >=20 > + >=20 > + ModifiedStructType =3D StructType.rstrip() >=20 > + if ModifiedStructType.endswith(']'): >=20 > + Idx 
=3D ModifiedStructType.index('[') >=20 > + if ArrayMarker !=3D ' ': >=20 > + # Auto array size >=20 > + OldTextBody.append('') >=20 > + ArrayVarName =3D VariableName >=20 > + if int(ArrayMarker) =3D=3D 1000: >=20 > + Count =3D 1 >=20 > + else: >=20 > + Count =3D int(ArrayMarker) + 1000 >=20 > + else: >=20 > + if Count < 1000: >=20 > + Count +=3D 1 >=20 > + >=20 > + VariableTemp =3D ArrayVarName + '[%d]' % ( >=20 > + Count if Count < 1000 else Count - 1000) >=20 > + OldTextBody[-1] =3D self.CreateField( >=20 > + None, VariableTemp, 0, Offset, >=20 > + ModifiedStructType[:Idx], '', >=20 > + 'Structure Array', '') >=20 > + else: >=20 > + ArrayVarName =3D '' >=20 > + OldTextBody.append(self.CreateField( >=20 > + None, VariableName, 0, Offset, >=20 > + ModifiedStructType, '', '', '')) >=20 > + >=20 > + if IncludeLine: >=20 > + StructBody.append(Line) >=20 > + else: >=20 > + OldTextBody.append(Line) >=20 > + >=20 > + if Match and Match.group(4) =3D=3D 'END': >=20 > + if Level =3D=3D 0: >=20 > + if (StructType !=3D Match.group(2)) or \ >=20 > + (VariableName !=3D Match.group(3)): >=20 > + print("Unmatched struct name '%s' and '%s' !" % >=20 > + (StructName, Match.group(2))) >=20 > + else: >=20 > + if IsCommonStruct: >=20 > + Suffix =3D ' /* _COMMON_STRUCT_END_ */' >=20 > + else: >=20 > + Suffix =3D '' >=20 > + Line =3D '} %s;%s\n\n\n' % (StructName, Suffix) >=20 > + StructBody.append(Line) >=20 > + if (Line not in NewTextBody) and \ >=20 > + (Line not in OldTextBody): >=20 > + NewTextBody.extend(StructBody) >=20 > + IncludeLine =3D False >=20 > + IsCommonStruct =3D False >=20 > + >=20 > + if not IncludeEmbedOnly: >=20 > + NewTextBody.extend(OldTextBody) >=20 > + >=20 > + if EmbedFound: >=20 > + NewTextBody =3D self.PostProcessBody(NewTextBody, False) >=20 > + >=20 > + NewTextBody =3D IncTextBody + NewTextBody >=20 > + return NewTextBody >=20 > + >=20 > + def WriteHeaderFile(self, TxtBody, FileName, Type=3D'h'): >=20 > + FileNameDef =3D os.path.basename(FileName).replace('.', '_') >=20 > + FileNameDef =3D re.sub('(.)([A-Z][a-z]+)', r'\1_\2', FileNameDef= ) >=20 > + FileNameDef =3D re.sub('([a-z0-9])([A-Z])', r'\1_\2', >=20 > + FileNameDef).upper() >=20 > + >=20 > + Lines =3D [] >=20 > + Lines.append("%s\n" % GetCopyrightHeader(Type)) >=20 > + Lines.append("#ifndef __%s__\n" % FileNameDef) >=20 > + Lines.append("#define __%s__\n\n" % FileNameDef) >=20 > + if Type =3D=3D 'h': >=20 > + Lines.append("#pragma pack(1)\n\n") >=20 > + Lines.extend(TxtBody) >=20 > + if Type =3D=3D 'h': >=20 > + Lines.append("#pragma pack()\n\n") >=20 > + Lines.append("#endif\n") >=20 > + >=20 > + # Don't rewrite if the contents are the same >=20 > + Create =3D True >=20 > + if os.path.exists(FileName): >=20 > + HdrFile =3D open(FileName, "r") >=20 > + OrgTxt =3D HdrFile.read() >=20 > + HdrFile.close() >=20 > + >=20 > + NewTxt =3D ''.join(Lines) >=20 > + if OrgTxt =3D=3D NewTxt: >=20 > + Create =3D False >=20 > + >=20 > + if Create: >=20 > + HdrFile =3D open(FileName, "w") >=20 > + HdrFile.write(''.join(Lines)) >=20 > + HdrFile.close() >=20 > + >=20 > + def CreateHeaderFile(self, HdrFileName, ComHdrFileName=3D''): >=20 > + LastStruct =3D '' >=20 > + SpaceIdx =3D 0 >=20 > + Offset =3D 0 >=20 > + FieldIdx =3D 0 >=20 > + LastFieldIdx =3D 0 >=20 > + ResvOffset =3D 0 >=20 > + ResvIdx =3D 0 >=20 > + TxtBody =3D [] >=20 > + LineBuffer =3D [] >=20 > + CfgTags =3D [] >=20 > + LastVisible =3D True >=20 > + >=20 > + TxtBody.append("typedef struct {\n") >=20 > + for Item in self._CfgItemList: >=20 > + # Search for CFGDATA tags >=20 > + 
Embed =3D Item["embed"].upper() >=20 > + if Embed.endswith(':START'): >=20 > + Match =3D re.match(r'(\w+)_CFG_DATA:TAG_([0-9A-F]+):STAR= T', >=20 > + Embed) >=20 > + if Match: >=20 > + TagName =3D Match.group(1) >=20 > + TagId =3D int(Match.group(2), 16) >=20 > + CfgTags.append((TagId, TagName)) >=20 > + >=20 > + # Only process visible items >=20 > + NextVisible =3D LastVisible >=20 > + >=20 > + if LastVisible and (Item['header'] =3D=3D 'OFF'): >=20 > + NextVisible =3D False >=20 > + ResvOffset =3D Item['offset'] >=20 > + elif (not LastVisible) and Item['header'] =3D=3D 'ON': >=20 > + NextVisible =3D True >=20 > + Name =3D "ReservedUpdSpace%d" % ResvIdx >=20 > + ResvIdx =3D ResvIdx + 1 >=20 > + TxtBody.append(self.CreateField( >=20 > + Item, Name, Item["offset"] - ResvOffset, >=20 > + ResvOffset, '', '', '', '')) >=20 > + FieldIdx +=3D 1 >=20 > + >=20 > + if Offset < Item["offset"]: >=20 > + if LastVisible: >=20 > + Name =3D "UnusedUpdSpace%d" % SpaceIdx >=20 > + LineBuffer.append(self.CreateField >=20 > + (Item, Name, Item["offset"] - >=20 > + Offset, Offset, '', '', '', '')) >=20 > + FieldIdx +=3D 1 >=20 > + SpaceIdx =3D SpaceIdx + 1 >=20 > + Offset =3D Item["offset"] >=20 > + >=20 > + LastVisible =3D NextVisible >=20 > + >=20 > + Offset =3D Offset + Item["length"] >=20 > + if LastVisible: >=20 > + for Each in LineBuffer: >=20 > + TxtBody.append(Each) >=20 > + LineBuffer =3D [] >=20 > + Embed =3D Item["embed"].upper() >=20 > + if Embed.endswith(':START') or Embed.endswith(':END'): >=20 > + # EMBED_STRUCT: StructName : \ >=20 > + # ItemName : VariableName : START|END >=20 > + Name, ArrayNum =3D self.GetStructArrayInfo(Item["str= uct"]) >=20 > + Remaining =3D Item["embed"] >=20 > + if (LastFieldIdx + 1 =3D=3D FieldIdx) and (LastStruc= t =3D=3D Name): >=20 > + ArrayMarker =3D ' ' >=20 > + else: >=20 > + ArrayMarker =3D '%d' % ArrayNum >=20 > + LastFieldIdx =3D FieldIdx >=20 > + LastStruct =3D Name >=20 > + Marker =3D '/* EMBED_STRUCT:%s:%s%s*/ ' % (Name, Rem= aining, >=20 > + ArrayMarker= ) >=20 > + # if Embed.endswith(':START') and Comment !=3D '': >=20 > + # Marker =3D '/* COMMENT:%s */ \n' % Item["comment"]= + Marker >=20 > + else: >=20 > + if Embed =3D=3D '': >=20 > + Marker =3D '' >=20 > + else: >=20 > + self.Error =3D "Invalid embedded structure \ >=20 > +format '%s'!\n" % Item["embed"] >=20 > + return 4 >=20 > + >=20 > + # Generate bit fields for structure >=20 > + if len(Item['subreg']) > 0 and Item["struct"]: >=20 > + StructType =3D Item["struct"] >=20 > + StructName, ArrayNum =3D self.GetStructArrayInfo(Str= uctType) >=20 > + if (LastFieldIdx + 1 =3D=3D FieldIdx) and \ >=20 > + (LastStruct =3D=3D Item["struct"]): >=20 > + ArrayMarker =3D ' ' >=20 > + else: >=20 > + ArrayMarker =3D '%d' % ArrayNum >=20 > + TxtBody.append('/* EMBED_STRUCT:%s:%s:%s:START%s*/\n= ' % >=20 > + (StructName, StructType, Item["cname"= ], >=20 > + ArrayMarker)) >=20 > + for SubItem in Item['subreg']: >=20 > + Name =3D SubItem["cname"] >=20 > + if Name.startswith(Item["cname"]): >=20 > + Name =3D Name[len(Item["cname"]) + 1:] >=20 > + Line =3D self.CreateField( >=20 > + SubItem, Name, SubItem["bitunit"], >=20 > + SubItem["offset"], SubItem['struct'], >=20 > + SubItem['name'], SubItem['help'], >=20 > + SubItem['option'], SubItem['bitlength']) >=20 > + TxtBody.append(Line) >=20 > + TxtBody.append('/* EMBED_STRUCT:%s:%s:%s:END%s*/\n' = % >=20 > + (StructName, StructType, Item["cname"= ], >=20 > + ArrayMarker)) >=20 > + LastFieldIdx =3D FieldIdx >=20 > + LastStruct =3D Item["struct"] >=20 > + FieldIdx +=3D 1 
>=20 > + else: >=20 > + FieldIdx +=3D 1 >=20 > + Line =3D Marker + self.CreateField( >=20 > + Item, Item["cname"], Item["length"], Item["offse= t"], >=20 > + Item['struct'], Item['name'], Item['help'], >=20 > + Item['option']) >=20 > + TxtBody.append(Line) >=20 > + >=20 > + TxtBody.append("}\n\n") >=20 > + >=20 > + # Handle the embedded data structure >=20 > + TxtBody =3D self.PostProcessBody(TxtBody) >=20 > + ComBody, TxtBody =3D self.SplitTextBody(TxtBody) >=20 > + >=20 > + # Prepare TAG defines >=20 > + PltTagDefTxt =3D ['\n'] >=20 > + ComTagDefTxt =3D ['\n'] >=20 > + for TagId, TagName in sorted(CfgTags): >=20 > + TagLine =3D '#define %-30s 0x%03X\n' % ('CDATA_%s_TAG' % >=20 > + TagName, TagId) >=20 > + if TagId < self._MinCfgTagId: >=20 > + # TAG ID < 0x100, it is a generic TAG >=20 > + ComTagDefTxt.append(TagLine) >=20 > + else: >=20 > + PltTagDefTxt.append(TagLine) >=20 > + PltTagDefTxt.append('\n\n') >=20 > + ComTagDefTxt.append('\n\n') >=20 > + >=20 > + # Write file back >=20 > + self.WriteHeaderFile(PltTagDefTxt + TxtBody, HdrFileName) >=20 > + if ComHdrFileName: >=20 > + self.WriteHeaderFile(ComTagDefTxt + ComBody, ComHdrFileName) >=20 > + >=20 > + return 0 >=20 > + >=20 > + def UpdateConfigItemValue(self, Item, ValueStr): >=20 > + IsArray =3D True if Item['value'].startswith('{') else False >=20 > + IsString =3D True if Item['value'].startswith("'") else False >=20 > + Bytes =3D self.ValueToByteArray(ValueStr, Item['length']) >=20 > + if IsString: >=20 > + NewValue =3D "'%s'" % Bytes.decode("utf-8") >=20 > + elif IsArray: >=20 > + NewValue =3D Bytes2Str(Bytes) >=20 > + else: >=20 > + Fmt =3D '0x%X' if Item['value'].startswith('0x') else '%d' >=20 > + NewValue =3D Fmt % Bytes2Val(Bytes) >=20 > + Item['value'] =3D NewValue >=20 > + >=20 > + def LoadDefaultFromBinaryArray(self, BinDat, IgnoreFind=3DFalse): >=20 > + FindOff =3D 0 >=20 > + StartOff =3D 0 >=20 > + for Item in self._CfgItemList: >=20 > + if Item['length'] =3D=3D 0: >=20 > + continue >=20 > + if not IgnoreFind and Item['find']: >=20 > + FindBin =3D Item['find'].encode() >=20 > + Offset =3D BinDat.find(FindBin) >=20 > + if Offset >=3D 0: >=20 > + TestOff =3D BinDat[Offset+len(FindBin):].find(FindBi= n) >=20 > + if TestOff >=3D 0: >=20 > + raise Exception('Multiple match found for "%s" != ' % >=20 > + Item['find']) >=20 > + FindOff =3D Offset + len(FindBin) >=20 > + StartOff =3D Item['offset'] >=20 > + else: >=20 > + raise Exception('Could not find "%s" !' % Item['find= ']) >=20 > + if Item['offset'] + Item['length'] > len(BinDat): >=20 > + raise Exception('Mismatching format between DSC \ >=20 > +and BIN files !') >=20 > + Offset =3D FindOff + (Item['offset'] - StartOff) >=20 > + ValStr =3D Bytes2Str(BinDat[Offset: Offset + Item['length']]= ) >=20 > + self.UpdateConfigItemValue(Item, ValStr) >=20 > + >=20 > + self.UpdateDefaultValue() >=20 > + >=20 > + def PatchBinaryArray(self, BinDat): >=20 > + FileOff =3D 0 >=20 > + Offset =3D 0 >=20 > + FindOff =3D 0 >=20 > + >=20 > + PatchList =3D [] >=20 > + CfgBin =3D bytearray() >=20 > + for Item in self._CfgItemList: >=20 > + if Item['length'] =3D=3D 0: >=20 > + continue >=20 > + >=20 > + if Item['find']: >=20 > + if len(CfgBin) > 0: >=20 > + PatchList.append((FileOff, CfgBin)) >=20 > + FindBin =3D Item['find'].encode() >=20 > + FileOff =3D BinDat.find(FindBin) >=20 > + if FileOff < 0: >=20 > + raise Exception('Could not find "%s" !' 
% Item['find= ']) >=20 > + else: >=20 > + TestOff =3D BinDat[FileOff+len(FindBin):].find(FindB= in) >=20 > + if TestOff >=3D 0: >=20 > + raise Exception('Multiple match found for "%s" != ' % >=20 > + Item['find']) >=20 > + FileOff +=3D len(FindBin) >=20 > + Offset =3D Item['offset'] >=20 > + FindOff =3D Offset >=20 > + CfgBin =3D bytearray() >=20 > + >=20 > + if Item['offset'] > Offset: >=20 > + Gap =3D Item['offset'] - Offset >=20 > + CfgBin.extend(b'\x00' * Gap) >=20 > + >=20 > + if Item['type'] =3D=3D 'Reserved' and Item['option'] =3D=3D = '$SKIP': >=20 > + # keep old data >=20 > + NewOff =3D FileOff + (Offset - FindOff) >=20 > + FileData =3D bytearray(BinDat[NewOff: NewOff + Item['len= gth']]) >=20 > + CfgBin.extend(FileData) >=20 > + else: >=20 > + CfgBin.extend(self.ValueToByteArray(Item['value'], >=20 > + Item['length'])) >=20 > + Offset =3D Item['offset'] + Item['length'] >=20 > + >=20 > + if len(CfgBin) > 0: >=20 > + PatchList.append((FileOff, CfgBin)) >=20 > + >=20 > + for FileOff, CfgBin in PatchList: >=20 > + Length =3D len(CfgBin) >=20 > + if FileOff + Length < len(BinDat): >=20 > + BinDat[FileOff:FileOff+Length] =3D CfgBin[:] >=20 > + >=20 > + return BinDat >=20 > + >=20 > + def GenerateBinaryArray(self): >=20 > + Offset =3D 0 >=20 > + BinDat =3D bytearray() >=20 > + for Item in self._CfgItemList: >=20 > + if Item['offset'] > Offset: >=20 > + Gap =3D Item['offset'] - Offset >=20 > + BinDat.extend(b'\x00' * Gap) >=20 > + BinDat.extend(self.ValueToByteArray(Item['value'], Item['len= gth'])) >=20 > + Offset =3D Item['offset'] + Item['length'] >=20 > + return BinDat >=20 > + >=20 > + def GenerateBinary(self, BinFileName): >=20 > + BinFile =3D open(BinFileName, "wb") >=20 > + BinFile.write(self.GenerateBinaryArray()) >=20 > + BinFile.close() >=20 > + return 0 >=20 > + >=20 > + def GenerateDataIncFile(self, DatIncFileName, BinFile=3DNone): >=20 > + # Put a prefix GUID before CFGDATA so that it can be located lat= er on >=20 > + Prefix =3D b'\xa7\xbd\x7f\x73\x20\x1e\x46\xd6\xbe\x8f\ >=20 > +x64\x12\x05\x8d\x0a\xa8' >=20 > + if BinFile: >=20 > + Fin =3D open(BinFile, 'rb') >=20 > + BinDat =3D Prefix + bytearray(Fin.read()) >=20 > + Fin.close() >=20 > + else: >=20 > + BinDat =3D Prefix + self.GenerateBinaryArray() >=20 > + >=20 > + FileName =3D os.path.basename(DatIncFileName).upper() >=20 > + FileName =3D FileName.replace('.', '_') >=20 > + >=20 > + TxtLines =3D [] >=20 > + >=20 > + TxtLines.append("UINT8 mConfigDataBlob[%d] =3D {\n" % len(BinDa= t)) >=20 > + Count =3D 0 >=20 > + Line =3D [' '] >=20 > + for Each in BinDat: >=20 > + Line.append('0x%02X, ' % Each) >=20 > + Count =3D Count + 1 >=20 > + if (Count & 0x0F) =3D=3D 0: >=20 > + Line.append('\n') >=20 > + TxtLines.append(''.join(Line)) >=20 > + Line =3D [' '] >=20 > + if len(Line) > 1: >=20 > + TxtLines.append(''.join(Line) + '\n') >=20 > + >=20 > + TxtLines.append("};\n\n") >=20 > + >=20 > + self.WriteHeaderFile(TxtLines, DatIncFileName, 'inc') >=20 > + >=20 > + return 0 >=20 > + >=20 > + def CheckCfgData(self): >=20 > + # Check if CfgData contains any duplicated name >=20 > + def AddItem(Item, ChkList): >=20 > + Name =3D Item['cname'] >=20 > + if Name in ChkList: >=20 > + return Item >=20 > + if Name not in ['Dummy', 'Reserved', 'CfgHeader', 'CondValue= ']: >=20 > + ChkList.append(Name) >=20 > + return None >=20 > + >=20 > + Duplicate =3D None >=20 > + ChkList =3D [] >=20 > + for Item in self._CfgItemList: >=20 > + Duplicate =3D AddItem(Item, ChkList) >=20 > + if not Duplicate: >=20 > + for SubItem in Item['subreg']: >=20 > 
+ Duplicate =3D AddItem(SubItem, ChkList) >=20 > + if Duplicate: >=20 > + break >=20 > + if Duplicate: >=20 > + break >=20 > + if Duplicate: >=20 > + self.Error =3D "Duplicated CFGDATA '%s' found !\n" % \ >=20 > + Duplicate['cname'] >=20 > + return -1 >=20 > + return 0 >=20 > + >=20 > + def PrintData(self): >=20 > + for Item in self._CfgItemList: >=20 > + if not Item['length']: >=20 > + continue >=20 > + print("%-10s @Offset:0x%04X Len:%3d Val:%s" % >=20 > + (Item['cname'], Item['offset'], Item['length'], >=20 > + Item['value'])) >=20 > + for SubItem in Item['subreg']: >=20 > + print(" %-20s BitOff:0x%04X BitLen:%-3d Val:%s" % >=20 > + (SubItem['cname'], SubItem['bitoffset'], >=20 > + SubItem['bitlength'], SubItem['value'])) >=20 > + >=20 > + def FormatArrayValue(self, Input, Length): >=20 > + Dat =3D self.ValueToByteArray(Input, Length) >=20 > + return ','.join('0x%02X' % Each for Each in Dat) >=20 > + >=20 > + def GetItemOptionList(self, Item): >=20 > + TmpList =3D [] >=20 > + if Item['type'] =3D=3D "Combo": >=20 > + if not Item['option'] in self._BuidinOption: >=20 > + OptList =3D Item['option'].split(',') >=20 > + for Option in OptList: >=20 > + Option =3D Option.strip() >=20 > + try: >=20 > + (OpVal, OpStr) =3D Option.split(':') >=20 > + except Exception: >=20 > + raise Exception("Invalide option format '%s' !" = % >=20 > + Option) >=20 > + TmpList.append((OpVal, OpStr)) >=20 > + return TmpList >=20 > + >=20 > + def WriteBsfStruct(self, BsfFd, Item): >=20 > + if Item['type'] =3D=3D "None": >=20 > + Space =3D "gPlatformFspPkgTokenSpaceGuid" >=20 > + else: >=20 > + Space =3D Item['space'] >=20 > + Line =3D " $%s_%s" % (Space, Item['cname']) >=20 > + Match =3D re.match("\\s*(\\{.+\\})\\s*", Item['value']) >=20 > + if Match: >=20 > + DefaultValue =3D self.FormatArrayValue(Match.group(1).strip(= ), >=20 > + Item['length']) >=20 > + else: >=20 > + DefaultValue =3D Item['value'].strip() >=20 > + if 'bitlength' in Item: >=20 > + if Item['bitlength']: >=20 > + BsfFd.write(" %s%s%4d bits $_DEFAULT_ =3D %s\n" % >=20 > + (Line, ' ' * (64 - len(Line)), Item['bitleng= th'], >=20 > + DefaultValue)) >=20 > + else: >=20 > + if Item['length']: >=20 > + BsfFd.write(" %s%s%4d bytes $_DEFAULT_ =3D %s\n" % >=20 > + (Line, ' ' * (64 - len(Line)), Item['length'= ], >=20 > + DefaultValue)) >=20 > + >=20 > + return self.GetItemOptionList(Item) >=20 > + >=20 > + def GetBsfOption(self, OptionName): >=20 > + if OptionName in self._CfgOptsDict: >=20 > + return self._CfgOptsDict[OptionName] >=20 > + else: >=20 > + return OptionName >=20 > + >=20 > + def WriteBsfOption(self, BsfFd, Item): >=20 > + PcdName =3D Item['space'] + '_' + Item['cname'] >=20 > + WriteHelp =3D 0 >=20 > + BsfLines =3D [] >=20 > + if Item['type'] =3D=3D "Combo": >=20 > + if Item['option'] in self._BuidinOption: >=20 > + Options =3D self._BuidinOption[Item['option']] >=20 > + else: >=20 > + Options =3D self.GetBsfOption(PcdName) >=20 > + BsfLines.append(' %s $%s, "%s", &%s,\n' % ( >=20 > + Item['type'], PcdName, Item['name'], Options)) >=20 > + WriteHelp =3D 1 >=20 > + elif Item['type'].startswith("EditNum"): >=20 > + Match =3D re.match("EditNum\\s*,\\s*(HEX|DEC)\\s*,\\s*\\(\ >=20 > +(\\d+|0x[0-9A-Fa-f]+)\\s*,\\s*(\\d+|0x[0-9A-Fa-f]+)\\)", Item['type']) >=20 > + if Match: >=20 > + BsfLines.append(' EditNum $%s, "%s", %s,\n' % ( >=20 > + PcdName, Item['name'], Match.group(1))) >=20 > + WriteHelp =3D 2 >=20 > + elif Item['type'].startswith("EditText"): >=20 > + BsfLines.append(' %s $%s, "%s",\n' % (Item['type'], PcdNa= me, >=20 > + Item['name'])) 
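
As a quick reference for the option syntax handled by GetItemOptionList()
above: a Combo item's option string such as "0:Disable, 1:Enable" is split
into (value, text) pairs. A tiny stand-alone sketch (the option string is
invented):

    option = '0:Disable, 1:Enable'
    pairs = [tuple(part.strip() for part in each.split(':'))
             for each in option.split(',')]
    print(pairs)   # [('0', 'Disable'), ('1', 'Enable')]
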
>=20 > + WriteHelp =3D 1 >=20 > + elif Item['type'] =3D=3D "Table": >=20 > + Columns =3D Item['option'].split(',') >=20 > + if len(Columns) !=3D 0: >=20 > + BsfLines.append(' %s $%s "%s",' % (Item['type'], PcdN= ame, >=20 > + Item['name'])) >=20 > + for Col in Columns: >=20 > + Fmt =3D Col.split(':') >=20 > + if len(Fmt) !=3D 3: >=20 > + raise Exception("Column format '%s' is invalid != " % >=20 > + Fmt) >=20 > + try: >=20 > + Dtype =3D int(Fmt[1].strip()) >=20 > + except Exception: >=20 > + raise Exception("Column size '%s' is invalid !" = % >=20 > + Fmt[1]) >=20 > + BsfLines.append('\n Column "%s", %d bytes, %s= ' % >=20 > + (Fmt[0].strip(), Dtype, Fmt[2].strip= ())) >=20 > + BsfLines.append(',\n') >=20 > + WriteHelp =3D 1 >=20 > + >=20 > + if WriteHelp > 0: >=20 > + HelpLines =3D Item['help'].split('\\n\\r') >=20 > + FirstLine =3D True >=20 > + for HelpLine in HelpLines: >=20 > + if FirstLine: >=20 > + FirstLine =3D False >=20 > + BsfLines.append(' Help "%s"\n' % (HelpLine)) >=20 > + else: >=20 > + BsfLines.append(' "%s"\n' % (HelpLine)) >=20 > + if WriteHelp =3D=3D 2: >=20 > + BsfLines.append(' "Valid range: %s ~ %s"\n' = % >=20 > + (Match.group(2), Match.group(3))) >=20 > + >=20 > + if len(Item['condition']) > 4: >=20 > + CondList =3D Item['condition'].split(',') >=20 > + Idx =3D 0 >=20 > + for Cond in CondList: >=20 > + Cond =3D Cond.strip() >=20 > + if Cond.startswith('#'): >=20 > + BsfLines.insert(Idx, Cond + '\n') >=20 > + Idx +=3D 1 >=20 > + elif Cond.startswith('@#'): >=20 > + BsfLines.append(Cond[1:] + '\n') >=20 > + >=20 > + for Line in BsfLines: >=20 > + BsfFd.write(Line) >=20 > + >=20 > + def WriteBsfPages(self, PageTree, BsfFd): >=20 > + BsfFd.write('\n') >=20 > + Key =3D next(iter(PageTree)) >=20 > + for Page in PageTree[Key]: >=20 > + PageName =3D next(iter(Page)) >=20 > + BsfFd.write('Page "%s"\n' % self._CfgPageDict[PageName]) >=20 > + if len(PageTree[Key]): >=20 > + self.WriteBsfPages(Page, BsfFd) >=20 > + >=20 > + BsfItems =3D [] >=20 > + for Item in self._CfgItemList: >=20 > + if Item['name'] !=3D '': >=20 > + if Item['page'] !=3D PageName: >=20 > + continue >=20 > + if len(Item['subreg']) > 0: >=20 > + for SubItem in Item['subreg']: >=20 > + if SubItem['name'] !=3D '': >=20 > + BsfItems.append(SubItem) >=20 > + else: >=20 > + BsfItems.append(Item) >=20 > + >=20 > + BsfItems.sort(key=3Dlambda x: x['order']) >=20 > + >=20 > + for Item in BsfItems: >=20 > + self.WriteBsfOption(BsfFd, Item) >=20 > + BsfFd.write("EndPage\n\n") >=20 > + >=20 > + def GenerateBsfFile(self, BsfFile): >=20 > + >=20 > + if BsfFile =3D=3D '': >=20 > + self.Error =3D "BSF output file '%s' is invalid" % BsfFile >=20 > + return 1 >=20 > + >=20 > + Error =3D 0 >=20 > + OptionDict =3D {} >=20 > + BsfFd =3D open(BsfFile, "w") >=20 > + BsfFd.write("%s\n" % GetCopyrightHeader('bsf')) >=20 > + BsfFd.write("%s\n" % self._GlobalDataDef) >=20 > + BsfFd.write("StructDef\n") >=20 > + NextOffset =3D -1 >=20 > + for Item in self._CfgItemList: >=20 > + if Item['find'] !=3D '': >=20 > + BsfFd.write('\n Find "%s"\n' % Item['find']) >=20 > + NextOffset =3D Item['offset'] + Item['length'] >=20 > + if Item['name'] !=3D '': >=20 > + if NextOffset !=3D Item['offset']: >=20 > + BsfFd.write(" Skip %d bytes\n" % >=20 > + (Item['offset'] - NextOffset)) >=20 > + if len(Item['subreg']) > 0: >=20 > + NextOffset =3D Item['offset'] >=20 > + BitsOffset =3D NextOffset * 8 >=20 > + for SubItem in Item['subreg']: >=20 > + BitsOffset +=3D SubItem['bitlength'] >=20 > + if SubItem['name'] =3D=3D '': >=20 > + if 'bitlength' in 
SubItem: >=20 > + BsfFd.write(" Skip %d bits\n" % >=20 > + (SubItem['bitlength'])) >=20 > + else: >=20 > + BsfFd.write(" Skip %d bytes\n" % >=20 > + (SubItem['length'])) >=20 > + else: >=20 > + Options =3D self.WriteBsfStruct(BsfFd, SubIt= em) >=20 > + if len(Options) > 0: >=20 > + OptionDict[SubItem >=20 > + ['space']+'_'+SubItem >=20 > + ['cname']] =3D Options >=20 > + >=20 > + NextBitsOffset =3D (Item['offset'] + Item['length'])= * 8 >=20 > + if NextBitsOffset > BitsOffset: >=20 > + BitsGap =3D NextBitsOffset - BitsOffset >=20 > + BitsRemain =3D BitsGap % 8 >=20 > + if BitsRemain: >=20 > + BsfFd.write(" Skip %d bits\n" % BitsR= emain) >=20 > + BitsGap -=3D BitsRemain >=20 > + BytesRemain =3D BitsGap // 8 >=20 > + if BytesRemain: >=20 > + BsfFd.write(" Skip %d bytes\n" % >=20 > + BytesRemain) >=20 > + NextOffset =3D Item['offset'] + Item['length'] >=20 > + else: >=20 > + NextOffset =3D Item['offset'] + Item['length'] >=20 > + Options =3D self.WriteBsfStruct(BsfFd, Item) >=20 > + if len(Options) > 0: >=20 > + OptionDict[Item['space']+'_'+Item['cname']] =3D = Options >=20 > + BsfFd.write("\nEndStruct\n\n") >=20 > + >=20 > + BsfFd.write("%s" % self._BuidinOptionTxt) >=20 > + >=20 > + NameList =3D [] >=20 > + OptionList =3D [] >=20 > + for Each in sorted(OptionDict): >=20 > + if OptionDict[Each] not in OptionList: >=20 > + NameList.append(Each) >=20 > + OptionList.append(OptionDict[Each]) >=20 > + BsfFd.write("List &%s\n" % Each) >=20 > + for Item in OptionDict[Each]: >=20 > + BsfFd.write(' Selection %s , "%s"\n' % >=20 > + (self.EvaluateExpress(Item[0]), Item[1])= ) >=20 > + BsfFd.write("EndList\n\n") >=20 > + else: >=20 > + # Item has idential options as other item >=20 > + # Try to reuse the previous options instead >=20 > + Idx =3D OptionList.index(OptionDict[Each]) >=20 > + self._CfgOptsDict[Each] =3D NameList[Idx] >=20 > + >=20 > + BsfFd.write("BeginInfoBlock\n") >=20 > + BsfFd.write(' PPVer "%s"\n' % (self._CfgBlkDict['ver'])= ) >=20 > + BsfFd.write(' Description "%s"\n' % (self._CfgBlkDict['name']= )) >=20 > + BsfFd.write("EndInfoBlock\n\n") >=20 > + >=20 > + self.WriteBsfPages(self._CfgPageTree, BsfFd) >=20 > + >=20 > + BsfFd.close() >=20 > + return Error >=20 > + >=20 > + def WriteDeltaLine(self, OutLines, Name, ValStr, IsArray): >=20 > + if IsArray: >=20 > + Output =3D '%s | { %s }' % (Name, ValStr) >=20 > + else: >=20 > + Output =3D '%s | 0x%X' % (Name, Array2Val(ValStr)) >=20 > + OutLines.append(Output) >=20 > + >=20 > + def WriteDeltaFile(self, OutFile, PlatformId, OutLines): >=20 > + DltFd =3D open(OutFile, "w") >=20 > + DltFd.write("%s\n" % GetCopyrightHeader('dlt', True)) >=20 > + if PlatformId is not None: >=20 > + DltFd.write('#\n') >=20 > + DltFd.write('# Delta configuration values \ >=20 > +for platform ID 0x%04X\n' % PlatformId) >=20 > + DltFd.write('#\n\n') >=20 > + for Line in OutLines: >=20 > + DltFd.write('%s\n' % Line) >=20 > + DltFd.close() >=20 > + >=20 > + def GenerateDeltaFile(self, OutFile, AbsfFile): >=20 > + # Parse ABSF Build in dict >=20 > + if not os.path.exists(AbsfFile): >=20 > + Lines =3D [] >=20 > + else: >=20 > + with open(AbsfFile) as Fin: >=20 > + Lines =3D Fin.readlines() >=20 > + >=20 > + AbsfBuiltValDict =3D {} >=20 > + Process =3D False >=20 > + for Line in Lines: >=20 > + Line =3D Line.strip() >=20 > + if Line.startswith('StructDef'): >=20 > + Process =3D True >=20 > + if Line.startswith('EndStruct'): >=20 > + break >=20 > + if not Process: >=20 > + continue >=20 > + Match =3D re.match('\\s*\\$gCfgData_(\\w+)\\s+\ >=20 > 
+(\\d+)\\s+(bits|bytes)\\s+\\$_AS_BUILT_\\s+=3D\\s+(.+)\\$', Line) >=20 > + if Match: >=20 > + if Match.group(1) not in AbsfBuiltValDict: >=20 > + AbsfBuiltValDict[Match.group(1)] =3D Match.group(4).= strip() >=20 > + else: >=20 > + raise Exception("Duplicated configuration \ >=20 > +name '%s' found !", Match.group(1)) >=20 > + >=20 > + # Match config item in DSC >=20 > + PlatformId =3D None >=20 > + OutLines =3D [] >=20 > + TagName =3D '' >=20 > + Level =3D 0 >=20 > + for Item in self._CfgItemList: >=20 > + Name =3D None >=20 > + if Level =3D=3D 0 and Item['embed'].endswith(':START'): >=20 > + TagName =3D Item['embed'].split(':')[0] >=20 > + Level +=3D 1 >=20 > + if Item['cname'] in AbsfBuiltValDict: >=20 > + ValStr =3D AbsfBuiltValDict[Item['cname']] >=20 > + Name =3D '%s.%s' % (TagName, Item['cname']) >=20 > + if not Item['subreg'] and Item['value'].startswith('{'): >=20 > + Value =3D Array2Val(Item['value']) >=20 > + IsArray =3D True >=20 > + else: >=20 > + Value =3D int(Item['value'], 16) >=20 > + IsArray =3D False >=20 > + AbsfVal =3D Array2Val(ValStr) >=20 > + if AbsfVal !=3D Value: >=20 > + if 'PLATFORMID_CFG_DATA.PlatformId' =3D=3D Name: >=20 > + PlatformId =3D AbsfVal >=20 > + self.WriteDeltaLine(OutLines, Name, ValStr, IsArray) >=20 > + else: >=20 > + if 'PLATFORMID_CFG_DATA.PlatformId' =3D=3D Name: >=20 > + raise Exception("'PlatformId' has the \ >=20 > +same value as DSC default !") >=20 > + >=20 > + if Item['subreg']: >=20 > + for SubItem in Item['subreg']: >=20 > + if SubItem['cname'] in AbsfBuiltValDict: >=20 > + ValStr =3D AbsfBuiltValDict[SubItem['cname']] >=20 > + if Array2Val(ValStr) =3D=3D int(SubItem['value']= , 16): >=20 > + continue >=20 > + Name =3D '%s.%s.%s' % (TagName, Item['cname'], >=20 > + SubItem['cname']) >=20 > + self.WriteDeltaLine(OutLines, Name, ValStr, Fals= e) >=20 > + >=20 > + if Item['embed'].endswith(':END'): >=20 > + Level -=3D 1 >=20 > + >=20 > + if PlatformId is None and Lines: >=20 > + raise Exception("'PlatformId' configuration \ >=20 > +is missing in ABSF file!") >=20 > + else: >=20 > + PlatformId =3D 0 >=20 > + >=20 > + self.WriteDeltaFile(OutFile, PlatformId, Lines) >=20 > + >=20 > + return 0 >=20 > + >=20 > + def GenerateDscFile(self, OutFile): >=20 > + DscFd =3D open(OutFile, "w") >=20 > + for Line in self._DscLines: >=20 > + DscFd.write(Line + '\n') >=20 > + DscFd.close() >=20 > + return 0 >=20 > + >=20 > + >=20 > +def Usage(): >=20 > + print('\n'.join([ >=20 > + "GenCfgData Version 0.01", >=20 > + "Usage:", >=20 > + " GenCfgData GENINC BinFile \ >=20 > +IncOutFile [-D Macros]", >=20 > + " GenCfgData GENPKL DscFile \ >=20 > +PklOutFile [-D Macros]", >=20 > + " GenCfgData GENINC DscFile[;DltFile] \ >=20 > +IncOutFile [-D Macros]", >=20 > + " GenCfgData GENBIN DscFile[;DltFile] \ >=20 > +BinOutFile [-D Macros]", >=20 > + " GenCfgData GENBSF DscFile[;DltFile] \ >=20 > +BsfOutFile [-D Macros]", >=20 > + " GenCfgData GENDLT DscFile[;AbsfFile] \ >=20 > +DltOutFile [-D Macros]", >=20 > + " GenCfgData GENDSC DscFile \ >=20 > +DscOutFile [-D Macros]", >=20 > + " GenCfgData GENHDR DscFile[;DltFile] \ >=20 > +HdrOutFile[;ComHdrOutFile] [-D Macros]" >=20 > + ])) >=20 > + >=20 > + >=20 > +def Main(): >=20 > + # >=20 > + # Parse the options and args >=20 > + # >=20 > + argc =3D len(sys.argv) >=20 > + if argc < 4: >=20 > + Usage() >=20 > + return 1 >=20 > + >=20 > + GenCfgData =3D CGenCfgData() >=20 > + Command =3D sys.argv[1].upper() >=20 > + OutFile =3D sys.argv[3] >=20 > + >=20 > + if argc > 5 and GenCfgData.ParseMacros(sys.argv[4:]) !=3D 0: >=20 > 
+ raise Exception("ERROR: Macro parsing failed !") >=20 > + >=20 > + FileList =3D sys.argv[2].split(';') >=20 > + if len(FileList) =3D=3D 2: >=20 > + DscFile =3D FileList[0] >=20 > + DltFile =3D FileList[1] >=20 > + elif len(FileList) =3D=3D 1: >=20 > + DscFile =3D FileList[0] >=20 > + DltFile =3D '' >=20 > + else: >=20 > + raise Exception("ERROR: Invalid parameter '%s' !" % sys.argv[2]) >=20 > + >=20 > + if Command =3D=3D "GENDLT" and DscFile.endswith('.dlt'): >=20 > + # It needs to expand an existing DLT file >=20 > + DltFile =3D DscFile >=20 > + Lines =3D CGenCfgData.ExpandIncludeFiles(DltFile) >=20 > + OutTxt =3D ''.join([x[0] for x in Lines]) >=20 > + OutFile =3D open(OutFile, "w") >=20 > + OutFile.write(OutTxt) >=20 > + OutFile.close() >=20 > + return 0 >=20 > + >=20 > + if not os.path.exists(DscFile): >=20 > + raise Exception("ERROR: Cannot open file '%s' !" % DscFile) >=20 > + >=20 > + CfgBinFile =3D '' >=20 > + if DltFile: >=20 > + if not os.path.exists(DltFile): >=20 > + raise Exception("ERROR: Cannot open file '%s' !" % DltFile) >=20 > + if Command =3D=3D "GENDLT": >=20 > + CfgBinFile =3D DltFile >=20 > + DltFile =3D '' >=20 > + >=20 > + BinFile =3D '' >=20 > + if (DscFile.lower().endswith('.bin')) and (Command =3D=3D "GENINC"): >=20 > + # It is binary file >=20 > + BinFile =3D DscFile >=20 > + DscFile =3D '' >=20 > + >=20 > + if BinFile: >=20 > + if GenCfgData.GenerateDataIncFile(OutFile, BinFile) !=3D 0: >=20 > + raise Exception(GenCfgData.Error) >=20 > + return 0 >=20 > + >=20 > + if DscFile.lower().endswith('.pkl'): >=20 > + with open(DscFile, "rb") as PklFile: >=20 > + GenCfgData.__dict__ =3D marshal.load(PklFile) >=20 > + else: >=20 > + if GenCfgData.ParseDscFile(DscFile) !=3D 0: >=20 > + raise Exception(GenCfgData.Error) >=20 > + >=20 > + # if GenCfgData.CheckCfgData() !=3D 0: >=20 > + # raise Exception(GenCfgData.Error) >=20 > + >=20 > + if GenCfgData.CreateVarDict() !=3D 0: >=20 > + raise Exception(GenCfgData.Error) >=20 > + >=20 > + if Command =3D=3D 'GENPKL': >=20 > + with open(OutFile, "wb") as PklFile: >=20 > + marshal.dump(GenCfgData.__dict__, PklFile) >=20 > + return 0 >=20 > + >=20 > + if DltFile and Command in ['GENHDR', 'GENBIN', 'GENINC', 'GENBSF']: >=20 > + if GenCfgData.OverrideDefaultValue(DltFile) !=3D 0: >=20 > + raise Exception(GenCfgData.Error) >=20 > + >=20 > + if GenCfgData.UpdateDefaultValue() !=3D 0: >=20 > + raise Exception(GenCfgData.Error) >=20 > + >=20 > + # GenCfgData.PrintData () >=20 > + >=20 > + if sys.argv[1] =3D=3D "GENBIN": >=20 > + if GenCfgData.GenerateBinary(OutFile) !=3D 0: >=20 > + raise Exception(GenCfgData.Error) >=20 > + >=20 > + elif sys.argv[1] =3D=3D "GENHDR": >=20 > + OutFiles =3D OutFile.split(';') >=20 > + BrdOutFile =3D OutFiles[0].strip() >=20 > + if len(OutFiles) > 1: >=20 > + ComOutFile =3D OutFiles[1].strip() >=20 > + else: >=20 > + ComOutFile =3D '' >=20 > + if GenCfgData.CreateHeaderFile(BrdOutFile, ComOutFile) !=3D 0: >=20 > + raise Exception(GenCfgData.Error) >=20 > + >=20 > + elif sys.argv[1] =3D=3D "GENBSF": >=20 > + if GenCfgData.GenerateBsfFile(OutFile) !=3D 0: >=20 > + raise Exception(GenCfgData.Error) >=20 > + >=20 > + elif sys.argv[1] =3D=3D "GENINC": >=20 > + if GenCfgData.GenerateDataIncFile(OutFile) !=3D 0: >=20 > + raise Exception(GenCfgData.Error) >=20 > + >=20 > + elif sys.argv[1] =3D=3D "GENDLT": >=20 > + if GenCfgData.GenerateDeltaFile(OutFile, CfgBinFile) !=3D 0: >=20 > + raise Exception(GenCfgData.Error) >=20 > + >=20 > + elif sys.argv[1] =3D=3D "GENDSC": >=20 > + if 
GenCfgData.GenerateDscFile(OutFile) != 0:
> +            raise Exception(GenCfgData.Error)
> +
> +    else:
> +        raise Exception("Unsupported command '%s' !" % Command)
> +
> +    return 0
> +
> +
> +if __name__ == '__main__':
> +    sys.exit(Main())
> diff --git a/IntelFsp2Pkg/Tools/ConfigEditor/GenYamlCfg.py b/IntelFsp2Pkg/Tools/ConfigEditor/GenYamlCfg.py
> new file mode 100644
> index 0000000000..2b6cbc6eb5
> --- /dev/null
> +++ b/IntelFsp2Pkg/Tools/ConfigEditor/GenYamlCfg.py
> @@ -0,0 +1,2241 @@
> +# @ GenYamlCfg.py
> +#
> +# Copyright (c) 2020, Intel Corporation. All rights reserved.<BR>
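
Before moving on to the next file: for anyone trying out the tool above, the
Usage() text earlier in this file translates into invocations along these
lines (script path, DSC and output file names are placeholders, not taken
from the patch):

    python GenCfgData.py GENHDR FspUpd.dsc FspUpd.h
    python GenCfgData.py GENBIN FspUpd.dsc FspUpdDefault.bin -D CFG_DEBUG=1
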
> +# SPDX-License-Identifier: BSD-2-Clause-Patent
> +#
> +#
> +
> +import os
> +import sys
> +import re
> +import marshal
> +import string
> +import operator as op
> +import ast
> +
> +from datetime import date
> +from collections import OrderedDict
> +from CommonUtility import value_to_bytearray, value_to_bytes, \
> +    bytes_to_value, get_bits_from_bytes, set_bits_to_bytes
> +
> +# Generated file copyright header
> +__copyright_tmp__ = """/** @file
> +
> +  Platform Configuration %s File.
> +
> +  Copyright (c) %4d, Intel Corporation. All rights reserved.<BR>
>=20 > + SPDX-License-Identifier: BSD-2-Clause-Patent >=20 > + >=20 > + This file is automatically generated. Please do NOT modify !!! >=20 > + >=20 > +**/ >=20 > +""" >=20 > + >=20 > + >=20 > +def get_copyright_header(file_type, allow_modify=3DFalse): >=20 > + file_description =3D { >=20 > + 'yaml': 'Boot Setting', >=20 > + 'dlt': 'Delta', >=20 > + 'inc': 'C Binary Blob', >=20 > + 'h': 'C Struct Header' >=20 > + } >=20 > + if file_type in ['yaml', 'dlt']: >=20 > + comment_char =3D '#' >=20 > + else: >=20 > + comment_char =3D '' >=20 > + lines =3D __copyright_tmp__.split('\n') >=20 > + if allow_modify: >=20 > + lines =3D [line for line in lines if 'Please do NOT modify' not = in line] >=20 > + copyright_hdr =3D '\n'.join('%s%s' % (comment_char, line) >=20 > + for line in lines)[:-1] + '\n' >=20 > + return copyright_hdr % (file_description[file_type], date.today().ye= ar) >=20 > + >=20 > + >=20 > +def check_quote(text): >=20 > + if (text[0] =3D=3D "'" and text[-1] =3D=3D "'") or (text[0] =3D=3D '= "' >=20 > + and text[-1] =3D=3D '"')= : >=20 > + return True >=20 > + return False >=20 > + >=20 > + >=20 > +def strip_quote(text): >=20 > + new_text =3D text.strip() >=20 > + if check_quote(new_text): >=20 > + return new_text[1:-1] >=20 > + return text >=20 > + >=20 > + >=20 > +def strip_delimiter(text, delim): >=20 > + new_text =3D text.strip() >=20 > + if new_text: >=20 > + if new_text[0] =3D=3D delim[0] and new_text[-1] =3D=3D delim[-1]= : >=20 > + return new_text[1:-1] >=20 > + return text >=20 > + >=20 > + >=20 > +def bytes_to_bracket_str(bytes): >=20 > + return '{ %s }' % (', '.join('0x%02x' % i for i in bytes)) >=20 > + >=20 > + >=20 > +def array_str_to_value(val_str): >=20 > + val_str =3D val_str.strip() >=20 > + val_str =3D strip_delimiter(val_str, '{}') >=20 > + val_str =3D strip_quote(val_str) >=20 > + value =3D 0 >=20 > + for each in val_str.split(',')[::-1]: >=20 > + each =3D each.strip() >=20 > + value =3D (value << 8) | int(each, 0) >=20 > + return value >=20 > + >=20 > + >=20 > +def write_lines(lines, file): >=20 > + fo =3D open(file, "w") >=20 > + fo.write(''.join([x[0] for x in lines])) >=20 > + fo.close() >=20 > + >=20 > + >=20 > +def read_lines(file): >=20 > + if not os.path.exists(file): >=20 > + test_file =3D os.path.basename(file) >=20 > + if os.path.exists(test_file): >=20 > + file =3D test_file >=20 > + fi =3D open(file, 'r') >=20 > + lines =3D fi.readlines() >=20 > + fi.close() >=20 > + return lines >=20 > + >=20 > + >=20 > +def expand_file_value(path, value_str): >=20 > + result =3D bytearray() >=20 > + match =3D re.match("\\{\\s*FILE:(.+)\\}", value_str) >=20 > + if match: >=20 > + file_list =3D match.group(1).split(',') >=20 > + for file in file_list: >=20 > + file =3D file.strip() >=20 > + bin_path =3D os.path.join(path, file) >=20 > + result.extend(bytearray(open(bin_path, 'rb').read())) >=20 > + return result >=20 > + >=20 > + >=20 > +class ExpressionEval(ast.NodeVisitor): >=20 > + operators =3D { >=20 > + ast.Add: op.add, >=20 > + ast.Sub: op.sub, >=20 > + ast.Mult: op.mul, >=20 > + ast.Div: op.floordiv, >=20 > + ast.Mod: op.mod, >=20 > + ast.Eq: op.eq, >=20 > + ast.NotEq: op.ne, >=20 > + ast.Gt: op.gt, >=20 > + ast.Lt: op.lt, >=20 > + ast.GtE: op.ge, >=20 > + ast.LtE: op.le, >=20 > + ast.BitXor: op.xor, >=20 > + ast.BitAnd: op.and_, >=20 > + ast.BitOr: op.or_, >=20 > + ast.Invert: op.invert, >=20 > + ast.USub: op.neg >=20 > + } >=20 > + >=20 > + def __init__(self): >=20 > + self._debug =3D False >=20 > + self._expression =3D '' >=20 > + self._namespace =3D {} 
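
A small usage note on two of the helpers above (assuming they can be imported
from this GenYamlCfg.py; the value is made up): bytes_to_bracket_str() prints
a byte array in bracket form, and array_str_to_value() folds such a bracket
string back into an integer, treating the first byte as least significant.

    from GenYamlCfg import bytes_to_bracket_str, array_str_to_value

    print(bytes_to_bracket_str(bytearray([0x78, 0x56, 0x34, 0x12])))
    # -> { 0x78, 0x56, 0x34, 0x12 }
    print(hex(array_str_to_value('{ 0x78, 0x56, 0x34, 0x12 }')))
    # -> 0x12345678
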
>=20 > + self._get_variable =3D None >=20 > + >=20 > + def eval(self, expr, vars=3D{}): >=20 > + self._expression =3D expr >=20 > + if type(vars) is dict: >=20 > + self._namespace =3D vars >=20 > + self._get_variable =3D None >=20 > + else: >=20 > + self._namespace =3D {} >=20 > + self._get_variable =3D vars >=20 > + node =3D ast.parse(self._expression, mode=3D'eval') >=20 > + result =3D self.visit(node.body) >=20 > + if self._debug: >=20 > + print('EVAL [ %s ] =3D %s' % (expr, str(result))) >=20 > + return result >=20 > + >=20 > + def visit_Name(self, node): >=20 > + if self._get_variable is not None: >=20 > + return self._get_variable(node.id) >=20 > + else: >=20 > + return self._namespace[node.id] >=20 > + >=20 > + def visit_Num(self, node): >=20 > + return node.n >=20 > + >=20 > + def visit_NameConstant(self, node): >=20 > + return node.value >=20 > + >=20 > + def visit_BoolOp(self, node): >=20 > + result =3D False >=20 > + if isinstance(node.op, ast.And): >=20 > + for value in node.values: >=20 > + result =3D self.visit(value) >=20 > + if not result: >=20 > + break >=20 > + elif isinstance(node.op, ast.Or): >=20 > + for value in node.values: >=20 > + result =3D self.visit(value) >=20 > + if result: >=20 > + break >=20 > + return True if result else False >=20 > + >=20 > + def visit_UnaryOp(self, node): >=20 > + val =3D self.visit(node.operand) >=20 > + return ExpressionEval.operators[type(node.op)](val) >=20 > + >=20 > + def visit_BinOp(self, node): >=20 > + lhs =3D self.visit(node.left) >=20 > + rhs =3D self.visit(node.right) >=20 > + return ExpressionEval.operators[type(node.op)](lhs, rhs) >=20 > + >=20 > + def visit_Compare(self, node): >=20 > + right =3D self.visit(node.left) >=20 > + result =3D True >=20 > + for operation, comp in zip(node.ops, node.comparators): >=20 > + if not result: >=20 > + break >=20 > + left =3D right >=20 > + right =3D self.visit(comp) >=20 > + result =3D ExpressionEval.operators[type(operation)](left, r= ight) >=20 > + return result >=20 > + >=20 > + def visit_Call(self, node): >=20 > + if node.func.id in ['ternary']: >=20 > + condition =3D self.visit(node.args[0]) >=20 > + val_true =3D self.visit(node.args[1]) >=20 > + val_false =3D self.visit(node.args[2]) >=20 > + return val_true if condition else val_false >=20 > + elif node.func.id in ['offset', 'length']: >=20 > + if self._get_variable is not None: >=20 > + return self._get_variable(node.args[0].s, node.func.id) >=20 > + else: >=20 > + raise ValueError("Unsupported function: " + repr(node)) >=20 > + >=20 > + def generic_visit(self, node): >=20 > + raise ValueError("malformed node or string: " + repr(node)) >=20 > + >=20 > + >=20 > +class CFG_YAML(): >=20 > + TEMPLATE =3D 'template' >=20 > + CONFIGS =3D 'configs' >=20 > + VARIABLE =3D 'variable' >=20 > + >=20 > + def __init__(self): >=20 > + self.log_line =3D False >=20 > + self.allow_template =3D False >=20 > + self.cfg_tree =3D None >=20 > + self.tmp_tree =3D None >=20 > + self.var_dict =3D None >=20 > + self.def_dict =3D {} >=20 > + self.yaml_path =3D '' >=20 > + self.lines =3D [] >=20 > + self.full_lines =3D [] >=20 > + self.index =3D 0 >=20 > + self.re_expand =3D re.compile( >=20 > + r'(.+:\s+|\s*\-\s*)!expand\s+\{\s*(\w+_TMPL)\s*:\s*\[(.+)]\s= *\}') >=20 > + self.re_include =3D re.compile(r'(.+:\s+|\s*\-\s*)!include\s+(.+= )') >=20 > + >=20 > + @staticmethod >=20 > + def count_indent(line): >=20 > + return next((i for i, c in enumerate(line) if not c.isspace()), >=20 > + len(line)) >=20 > + >=20 > + @staticmethod >=20 > + def 
substitue_args(text, arg_dict): >=20 > + for arg in arg_dict: >=20 > + text =3D text.replace('$' + arg, arg_dict[arg]) >=20 > + return text >=20 > + >=20 > + @staticmethod >=20 > + def dprint(*args): >=20 > + pass >=20 > + >=20 > + def process_include(self, line, insert=3DTrue): >=20 > + match =3D self.re_include.match(line) >=20 > + if not match: >=20 > + raise Exception("Invalid !include format '%s' !" % line.stri= p()) >=20 > + >=20 > + prefix =3D match.group(1) >=20 > + include =3D match.group(2) >=20 > + if prefix.strip() =3D=3D '-': >=20 > + prefix =3D '' >=20 > + adjust =3D 0 >=20 > + else: >=20 > + adjust =3D 2 >=20 > + >=20 > + include =3D strip_quote(include) >=20 > + request =3D CFG_YAML.count_indent(line) + adjust >=20 > + >=20 > + if self.log_line: >=20 > + # remove the include line itself >=20 > + del self.full_lines[-1] >=20 > + >=20 > + inc_path =3D os.path.join(self.yaml_path, include) >=20 > + if not os.path.exists(inc_path): >=20 > + # try relative path to project root >=20 > + try_path =3D os.path.join(os.path.dirname(os.path.realpath(_= _file__) >=20 > + ), "../..", include) >=20 > + if os.path.exists(try_path): >=20 > + inc_path =3D try_path >=20 > + else: >=20 > + raise Exception("ERROR: Cannot open file '%s'." % inc_pa= th) >=20 > + >=20 > + lines =3D read_lines(inc_path) >=20 > + current =3D 0 >=20 > + same_line =3D False >=20 > + for idx, each in enumerate(lines): >=20 > + start =3D each.lstrip() >=20 > + if start =3D=3D '' or start[0] =3D=3D '#': >=20 > + continue >=20 > + >=20 > + if start[0] =3D=3D '>': >=20 > + # append the content directly at the same line >=20 > + same_line =3D True >=20 > + >=20 > + start =3D idx >=20 > + current =3D CFG_YAML.count_indent(each) >=20 > + break >=20 > + >=20 > + lines =3D lines[start+1:] if same_line else lines[start:] >=20 > + leading =3D '' >=20 > + if same_line: >=20 > + request =3D len(prefix) >=20 > + leading =3D '>' >=20 > + >=20 > + lines =3D [prefix + '%s\n' % leading] + [' ' * request + >=20 > + i[current:] for i in line= s] >=20 > + if insert: >=20 > + self.lines =3D lines + self.lines >=20 > + >=20 > + return lines >=20 > + >=20 > + def process_expand(self, line): >=20 > + match =3D self.re_expand.match(line) >=20 > + if not match: >=20 > + raise Exception("Invalid !expand format '%s' !" % line.strip= ()) >=20 > + lines =3D [] >=20 > + prefix =3D match.group(1) >=20 > + temp_name =3D match.group(2) >=20 > + args =3D match.group(3) >=20 > + >=20 > + if prefix.strip() =3D=3D '-': >=20 > + indent =3D 0 >=20 > + else: >=20 > + indent =3D 2 >=20 > + lines =3D self.process_expand_template(temp_name, prefix, args, = indent) >=20 > + self.lines =3D lines + self.lines >=20 > + >=20 > + def process_expand_template(self, temp_name, prefix, args, indent=3D= 2): >=20 > + # expand text with arg substitution >=20 > + if temp_name not in self.tmp_tree: >=20 > + raise Exception("Could not find template '%s' !" 
% temp_name= ) >=20 > + parts =3D args.split(',') >=20 > + parts =3D [i.strip() for i in parts] >=20 > + num =3D len(parts) >=20 > + arg_dict =3D dict(zip(['(%d)' % (i + 1) for i in range(num)], pa= rts)) >=20 > + str_data =3D self.tmp_tree[temp_name] >=20 > + text =3D DefTemplate(str_data).safe_substitute(self.def_dict) >=20 > + text =3D CFG_YAML.substitue_args(text, arg_dict) >=20 > + target =3D CFG_YAML.count_indent(prefix) + indent >=20 > + current =3D CFG_YAML.count_indent(text) >=20 > + padding =3D target * ' ' >=20 > + if indent =3D=3D 0: >=20 > + leading =3D [] >=20 > + else: >=20 > + leading =3D [prefix + '\n'] >=20 > + text =3D leading + [(padding + i + '\n')[current:] >=20 > + for i in text.splitlines()] >=20 > + return text >=20 > + >=20 > + def load_file(self, yaml_file): >=20 > + self.index =3D 0 >=20 > + self.lines =3D read_lines(yaml_file) >=20 > + >=20 > + def peek_line(self): >=20 > + if len(self.lines) =3D=3D 0: >=20 > + return None >=20 > + else: >=20 > + return self.lines[0] >=20 > + >=20 > + def put_line(self, line): >=20 > + self.lines.insert(0, line) >=20 > + if self.log_line: >=20 > + del self.full_lines[-1] >=20 > + >=20 > + def get_line(self): >=20 > + if len(self.lines) =3D=3D 0: >=20 > + return None >=20 > + else: >=20 > + line =3D self.lines.pop(0) >=20 > + if self.log_line: >=20 > + self.full_lines.append(line.rstrip()) >=20 > + return line >=20 > + >=20 > + def get_multiple_line(self, indent): >=20 > + text =3D '' >=20 > + newind =3D indent + 1 >=20 > + while True: >=20 > + line =3D self.peek_line() >=20 > + if line is None: >=20 > + break >=20 > + sline =3D line.strip() >=20 > + if sline !=3D '': >=20 > + newind =3D CFG_YAML.count_indent(line) >=20 > + if newind <=3D indent: >=20 > + break >=20 > + self.get_line() >=20 > + if sline !=3D '': >=20 > + text =3D text + line >=20 > + return text >=20 > + >=20 > + def traverse_cfg_tree(self, handler): >=20 > + def _traverse_cfg_tree(root, level=3D0): >=20 > + # config structure >=20 > + for key in root: >=20 > + if type(root[key]) is OrderedDict: >=20 > + level +=3D 1 >=20 > + handler(key, root[key], level) >=20 > + _traverse_cfg_tree(root[key], level) >=20 > + level -=3D 1 >=20 > + _traverse_cfg_tree(self.cfg_tree) >=20 > + >=20 > + def count(self): >=20 > + def _count(name, cfgs, level): >=20 > + num[0] +=3D 1 >=20 > + num =3D [0] >=20 > + self.traverse_cfg_tree(_count) >=20 > + return num[0] >=20 > + >=20 > + def parse(self, parent_name=3D'', curr=3DNone, level=3D0): >=20 > + child =3D None >=20 > + last_indent =3D None >=20 > + key =3D '' >=20 > + temp_chk =3D {} >=20 > + >=20 > + while True: >=20 > + line =3D self.get_line() >=20 > + if line is None: >=20 > + break >=20 > + >=20 > + curr_line =3D line.strip() >=20 > + if curr_line =3D=3D '' or curr_line[0] =3D=3D '#': >=20 > + continue >=20 > + >=20 > + indent =3D CFG_YAML.count_indent(line) >=20 > + if last_indent is None: >=20 > + last_indent =3D indent >=20 > + >=20 > + if indent !=3D last_indent: >=20 > + # outside of current block, put the line back to queue >=20 > + self.put_line(' ' * indent + curr_line) >=20 > + >=20 > + if curr_line.endswith(': >'): >=20 > + # multiline marker >=20 > + old_count =3D len(self.full_lines) >=20 > + line =3D self.get_multiple_line(indent) >=20 > + if self.log_line and not self.allow_template \ >=20 > + and '!include ' in line: >=20 > + # expand include in template >=20 > + new_lines =3D [] >=20 > + lines =3D line.splitlines() >=20 > + for idx, each in enumerate(lines): >=20 > + if '!include ' in each: >=20 > + new_line 
=3D ''.join(self.process_include(ea= ch, >=20 > + Fals= e)) >=20 > + new_lines.append(new_line) >=20 > + else: >=20 > + new_lines.append(each) >=20 > + self.full_lines =3D self.full_lines[:old_count] + ne= w_lines >=20 > + curr_line =3D curr_line + line >=20 > + >=20 > + if indent > last_indent: >=20 > + # child nodes >=20 > + if child is None: >=20 > + raise Exception('Unexpected format at line: %s' >=20 > + % (curr_line)) >=20 > + >=20 > + level +=3D 1 >=20 > + self.parse(key, child, level) >=20 > + level -=3D 1 >=20 > + line =3D self.peek_line() >=20 > + if line is not None: >=20 > + curr_line =3D line.strip() >=20 > + indent =3D CFG_YAML.count_indent(line) >=20 > + if indent >=3D last_indent: >=20 > + # consume the line >=20 > + self.get_line() >=20 > + else: >=20 > + # end of file >=20 > + indent =3D -1 >=20 > + >=20 > + if curr is None: >=20 > + curr =3D OrderedDict() >=20 > + >=20 > + if indent < last_indent: >=20 > + return curr >=20 > + >=20 > + marker1 =3D curr_line[0] >=20 > + marker2 =3D curr_line[-1] >=20 > + start =3D 1 if marker1 =3D=3D '-' else 0 >=20 > + pos =3D curr_line.find(': ') >=20 > + if pos > 0: >=20 > + child =3D None >=20 > + key =3D curr_line[start:pos].strip() >=20 > + if curr_line[pos + 2] =3D=3D '>': >=20 > + curr[key] =3D curr_line[pos + 3:] >=20 > + else: >=20 > + # XXXX: !include / !expand >=20 > + if '!include ' in curr_line: >=20 > + self.process_include(line) >=20 > + elif '!expand ' in curr_line: >=20 > + if self.allow_template and not self.log_line: >=20 > + self.process_expand(line) >=20 > + else: >=20 > + value_str =3D curr_line[pos + 2:].strip() >=20 > + curr[key] =3D value_str >=20 > + if self.log_line and value_str[0] =3D=3D '{': >=20 > + # expand {FILE: xxxx} format in the log line >=20 > + if value_str[1:].rstrip().startswith('FILE:'= ): >=20 > + value_bytes =3D expand_file_value( >=20 > + self.yaml_path, value_str) >=20 > + value_str =3D bytes_to_bracket_str(value= _bytes) >=20 > + self.full_lines[-1] =3D line[ >=20 > + :indent] + curr_line[:pos + 2] + val= ue_str >=20 > + >=20 > + elif marker2 =3D=3D ':': >=20 > + child =3D OrderedDict() >=20 > + key =3D curr_line[start:-1].strip() >=20 > + if key =3D=3D '$ACTION': >=20 > + # special virtual nodes, rename to ensure unique key >=20 > + key =3D '$ACTION_%04X' % self.index >=20 > + self.index +=3D 1 >=20 > + if key in curr: >=20 > + if key not in temp_chk: >=20 > + # check for duplicated keys at same level >=20 > + temp_chk[key] =3D 1 >=20 > + else: >=20 > + raise Exception("Duplicated item '%s:%s' found != " >=20 > + % (parent_name, key)) >=20 > + >=20 > + curr[key] =3D child >=20 > + if self.var_dict is None and key =3D=3D CFG_YAML.VARIABL= E: >=20 > + self.var_dict =3D child >=20 > + if self.tmp_tree is None and key =3D=3D CFG_YAML.TEMPLAT= E: >=20 > + self.tmp_tree =3D child >=20 > + if self.var_dict: >=20 > + for each in self.var_dict: >=20 > + txt =3D self.var_dict[each] >=20 > + if type(txt) is str: >=20 > + self.def_dict['(%s)' % each] =3D txt >=20 > + if self.tmp_tree and key =3D=3D CFG_YAML.CONFIGS: >=20 > + # apply template for the main configs >=20 > + self.allow_template =3D True >=20 > + else: >=20 > + child =3D None >=20 > + # - !include cfg_opt.yaml >=20 > + if '!include ' in curr_line: >=20 > + self.process_include(line) >=20 > + >=20 > + return curr >=20 > + >=20 > + def load_yaml(self, opt_file): >=20 > + self.var_dict =3D None >=20 > + self.yaml_path =3D os.path.dirname(opt_file) >=20 > + self.load_file(opt_file) >=20 > + yaml_tree =3D self.parse() >=20 > + self.tmp_tree =3D 
yaml_tree[CFG_YAML.TEMPLATE] >=20 > + self.cfg_tree =3D yaml_tree[CFG_YAML.CONFIGS] >=20 > + return self.cfg_tree >=20 > + >=20 > + def expand_yaml(self, opt_file): >=20 > + self.log_line =3D True >=20 > + self.load_yaml(opt_file) >=20 > + self.log_line =3D False >=20 > + text =3D '\n'.join(self.full_lines) >=20 > + self.full_lines =3D [] >=20 > + return text >=20 > + >=20 > + >=20 > +class DefTemplate(string.Template): >=20 > + idpattern =3D '\\([_A-Z][_A-Z0-9]*\\)|[_A-Z][_A-Z0-9]*' >=20 > + >=20 > + >=20 > +class CGenYamlCfg: >=20 > + STRUCT =3D '$STRUCT' >=20 > + bits_width =3D {'b': 1, 'B': 8, 'W': 16, 'D': 32, 'Q': 64} >=20 > + builtin_option =3D {'$EN_DIS': [('0', 'Disable'), ('1', 'Enable')]} >=20 > + exclude_struct =3D ['FSP_UPD_HEADER', 'FSPT_ARCH_UPD', >=20 > + 'FSPM_ARCH_UPD', 'FSPS_ARCH_UPD', >=20 > + 'GPIO_GPP_*', 'GPIO_CFG_DATA', >=20 > + 'GpioConfPad*', 'GpioPinConfig', >=20 > + 'BOOT_OPTION*', 'PLATFORMID_CFG_DATA', '\\w+_Half[= 01]'] >=20 > + include_tag =3D ['GPIO_CFG_DATA'] >=20 > + keyword_set =3D set(['name', 'type', 'option', 'help', 'length', >=20 > + 'value', 'order', 'struct', 'condition']) >=20 > + >=20 > + def __init__(self): >=20 > + self._mode =3D '' >=20 > + self._debug =3D False >=20 > + self._macro_dict =3D {} >=20 > + self.initialize() >=20 > + >=20 > + def initialize(self): >=20 > + self._old_bin =3D None >=20 > + self._cfg_tree =3D {} >=20 > + self._tmp_tree =3D {} >=20 > + self._cfg_list =3D [] >=20 > + self._cfg_page =3D {'root': {'title': '', 'child': []}} >=20 > + self._cur_page =3D '' >=20 > + self._var_dict =3D {} >=20 > + self._def_dict =3D {} >=20 > + self._yaml_path =3D '' >=20 > + >=20 > + @staticmethod >=20 > + def deep_convert_dict(layer): >=20 > + # convert OrderedDict to list + dict >=20 > + new_list =3D layer >=20 > + if isinstance(layer, OrderedDict): >=20 > + new_list =3D list(layer.items()) >=20 > + for idx, pair in enumerate(new_list): >=20 > + new_node =3D CGenYamlCfg.deep_convert_dict(pair[1]) >=20 > + new_list[idx] =3D dict({pair[0]: new_node}) >=20 > + return new_list >=20 > + >=20 > + @staticmethod >=20 > + def deep_convert_list(layer): >=20 > + if isinstance(layer, list): >=20 > + od =3D OrderedDict({}) >=20 > + for each in layer: >=20 > + if isinstance(each, dict): >=20 > + key =3D next(iter(each)) >=20 > + od[key] =3D CGenYamlCfg.deep_convert_list(each[key]) >=20 > + return od >=20 > + else: >=20 > + return layer >=20 > + >=20 > + @staticmethod >=20 > + def expand_include_files(file_path, cur_dir=3D''): >=20 > + if cur_dir =3D=3D '': >=20 > + cur_dir =3D os.path.dirname(file_path) >=20 > + file_path =3D os.path.basename(file_path) >=20 > + >=20 > + input_file_path =3D os.path.join(cur_dir, file_path) >=20 > + file =3D open(input_file_path, "r") >=20 > + lines =3D file.readlines() >=20 > + file.close() >=20 > + new_lines =3D [] >=20 > + for line_num, line in enumerate(lines): >=20 > + match =3D re.match("^!include\\s*(.+)?$", line.strip()) >=20 > + if match: >=20 > + inc_path =3D match.group(1) >=20 > + tmp_path =3D os.path.join(cur_dir, inc_path) >=20 > + org_path =3D tmp_path >=20 > + if not os.path.exists(tmp_path): >=20 > + cur_dir =3D os.path.join(os.path.dirname >=20 > + (os.path.realpath(__file__) >=20 > + ), "..", "..") >=20 > + tmp_path =3D os.path.join(cur_dir, inc_path) >=20 > + if not os.path.exists(tmp_path): >=20 > + raise Exception("ERROR: Cannot open include\ >=20 > + file '%s'." 
% org_path) >=20 > + else: >=20 > + new_lines.append(('# Included from file: %s\n' % inc= _path, >=20 > + tmp_path, 0)) >=20 > + new_lines.append(('# %s\n' % ('=3D' * 80), tmp_path,= 0)) >=20 > + new_lines.extend(CGenYamlCfg.expand_include_files >=20 > + (inc_path, cur_dir)) >=20 > + else: >=20 > + new_lines.append((line, input_file_path, line_num)) >=20 > + >=20 > + return new_lines >=20 > + >=20 > + @staticmethod >=20 > + def format_struct_field_name(input, count=3D0): >=20 > + name =3D '' >=20 > + cap =3D True >=20 > + if '_' in input: >=20 > + input =3D input.lower() >=20 > + for each in input: >=20 > + if each =3D=3D '_': >=20 > + cap =3D True >=20 > + continue >=20 > + elif cap: >=20 > + each =3D each.upper() >=20 > + cap =3D False >=20 > + name =3D name + each >=20 > + >=20 > + if count > 1: >=20 > + name =3D '%s[%d]' % (name, count) >=20 > + >=20 > + return name >=20 > + >=20 > + def get_mode(self): >=20 > + return self._mode >=20 > + >=20 > + def set_mode(self, mode): >=20 > + self._mode =3D mode >=20 > + >=20 > + def get_last_error(self): >=20 > + return '' >=20 > + >=20 > + def get_variable(self, var, attr=3D'value'): >=20 > + if var in self._var_dict: >=20 > + var =3D self._var_dict[var] >=20 > + return var >=20 > + >=20 > + item =3D self.locate_cfg_item(var, False) >=20 > + if item is None: >=20 > + raise ValueError("Cannot find variable '%s' !" % var) >=20 > + >=20 > + if item: >=20 > + if 'indx' in item: >=20 > + item =3D self.get_item_by_index(item['indx']) >=20 > + if attr =3D=3D 'offset': >=20 > + var =3D item['offset'] >=20 > + elif attr =3D=3D 'length': >=20 > + var =3D item['length'] >=20 > + elif attr =3D=3D 'value': >=20 > + var =3D self.get_cfg_item_value(item) >=20 > + else: >=20 > + raise ValueError("Unsupported variable attribute '%s' !"= % >=20 > + attr) >=20 > + return var >=20 > + >=20 > + def eval(self, expr): >=20 > + def _handler(pattern): >=20 > + if pattern.group(1): >=20 > + target =3D 1 >=20 > + else: >=20 > + target =3D 2 >=20 > + result =3D self.get_variable(pattern.group(target)) >=20 > + if result is None: >=20 > + raise ValueError('Unknown variable $(%s) !' 
% >=20 > + pattern.group(target)) >=20 > + return hex(result) >=20 > + >=20 > + expr_eval =3D ExpressionEval() >=20 > + if '$' in expr: >=20 > + # replace known variable first >=20 > + expr =3D re.sub(r'\$\(([_a-zA-Z][\w\.]*)\)|\$([_a-zA-Z][\w\.= ]*)', >=20 > + _handler, expr) >=20 > + return expr_eval.eval(expr, self.get_variable) >=20 > + >=20 > + def parse_macros(self, macro_def_str): >=20 > + # ['-DABC=3D1', '-D', 'CFG_DEBUG=3D1', '-D', 'CFG_OUTDIR=3DBuild= '] >=20 > + self._macro_dict =3D {} >=20 > + is_expression =3D False >=20 > + for macro in macro_def_str: >=20 > + if macro.startswith('-D'): >=20 > + is_expression =3D True >=20 > + if len(macro) > 2: >=20 > + macro =3D macro[2:] >=20 > + else: >=20 > + continue >=20 > + if is_expression: >=20 > + is_expression =3D False >=20 > + match =3D re.match("(\\w+)=3D(.+)", macro) >=20 > + if match: >=20 > + self._macro_dict[match.group(1)] =3D match.group(2) >=20 > + else: >=20 > + match =3D re.match("(\\w+)", macro) >=20 > + if match: >=20 > + self._macro_dict[match.group(1)] =3D '' >=20 > + if len(self._macro_dict) =3D=3D 0: >=20 > + error =3D 1 >=20 > + else: >=20 > + error =3D 0 >=20 > + if self._debug: >=20 > + print("INFO : Macro dictionary:") >=20 > + for each in self._macro_dict: >=20 > + print(" $(%s) =3D [ %s ]" >=20 > + % (each, self._macro_dict[each])) >=20 > + return error >=20 > + >=20 > + def get_cfg_list(self, page_id=3DNone): >=20 > + if page_id is None: >=20 > + # return full list >=20 > + return self._cfg_list >=20 > + else: >=20 > + # build a new list for items under a page ID >=20 > + cfgs =3D [i for i in self._cfg_list if i['cname'] and >=20 > + (i['page'] =3D=3D page_id)] >=20 > + return cfgs >=20 > + >=20 > + def get_cfg_page(self): >=20 > + return self._cfg_page >=20 > + >=20 > + def get_cfg_item_length(self, item): >=20 > + return item['length'] >=20 > + >=20 > + def get_cfg_item_value(self, item, array=3DFalse): >=20 > + value_str =3D item['value'] >=20 > + length =3D item['length'] >=20 > + return self.get_value(value_str, length, array) >=20 > + >=20 > + def format_value_to_str(self, value, bit_length, old_value=3D''): >=20 > + # value is always int >=20 > + length =3D (bit_length + 7) // 8 >=20 > + fmt =3D '' >=20 > + if old_value.startswith('0x'): >=20 > + fmt =3D '0x' >=20 > + elif old_value and (old_value[0] in ['"', "'", '{']): >=20 > + fmt =3D old_value[0] >=20 > + else: >=20 > + fmt =3D '' >=20 > + >=20 > + bvalue =3D value_to_bytearray(value, length) >=20 > + if fmt in ['"', "'"]: >=20 > + svalue =3D bvalue.rstrip(b'\x00').decode() >=20 > + value_str =3D fmt + svalue + fmt >=20 > + elif fmt =3D=3D "{": >=20 > + value_str =3D '{ ' + ', '.join(['0x%02x' % i for i in bvalue= ]) + ' }' >=20 > + elif fmt =3D=3D '0x': >=20 > + hex_len =3D length * 2 >=20 > + if len(old_value) =3D=3D hex_len + 2: >=20 > + fstr =3D '0x%%0%dX' % hex_len >=20 > + else: >=20 > + fstr =3D '0x%X' >=20 > + value_str =3D fstr % value >=20 > + else: >=20 > + if length <=3D 2: >=20 > + value_str =3D '%d' % value >=20 > + elif length <=3D 8: >=20 > + value_str =3D '0x%x' % value >=20 > + else: >=20 > + value_str =3D '{ ' + ', '.join(['0x%02x' % i for i in >=20 > + bvalue]) + ' }' >=20 > + return value_str >=20 > + >=20 > + def reformat_value_str(self, value_str, bit_length, old_value=3DNone= ): >=20 > + value =3D self.parse_value(value_str, bit_length, False) >=20 > + if old_value is None: >=20 > + old_value =3D value_str >=20 > + new_value =3D self.format_value_to_str(value, bit_length, old_va= lue) >=20 > + return new_value >=20 > + 
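
To make the behaviour of the formatting helpers above concrete, here is how
format_value_to_str() would render a couple of invented values, assuming a
CGenYamlCfg instance named cfg:

    # A default written as zero-padded hex in the YAML keeps its padding.
    cfg.format_value_to_str(11, 16, old_value='0x000A')      # -> '0x000B'
    # A default written as a byte array keeps the bracket syntax
    # (little endian, two bytes for a 16-bit field).
    cfg.format_value_to_str(0x0201, 16, old_value='{ 0 }')   # -> '{ 0x01, 0x02 }'
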
>=20 > + def get_value(self, value_str, bit_length, array=3DTrue): >=20 > + value_str =3D value_str.strip() >=20 > + if value_str[0] =3D=3D "'" and value_str[-1] =3D=3D "'" or \ >=20 > + value_str[0] =3D=3D '"' and value_str[-1] =3D=3D '"': >=20 > + value_str =3D value_str[1:-1] >=20 > + bvalue =3D bytearray(value_str.encode()) >=20 > + if len(bvalue) =3D=3D 0: >=20 > + bvalue =3D bytearray(b'\x00') >=20 > + if array: >=20 > + return bvalue >=20 > + else: >=20 > + return bytes_to_value(bvalue) >=20 > + else: >=20 > + if value_str[0] in '{': >=20 > + value_str =3D value_str[1:-1].strip() >=20 > + value =3D 0 >=20 > + for each in value_str.split(',')[::-1]: >=20 > + each =3D each.strip() >=20 > + value =3D (value << 8) | int(each, 0) >=20 > + if array: >=20 > + length =3D (bit_length + 7) // 8 >=20 > + return value_to_bytearray(value, length) >=20 > + else: >=20 > + return value >=20 > + >=20 > + def parse_value(self, value_str, bit_length, array=3DTrue): >=20 > + length =3D (bit_length + 7) // 8 >=20 > + if check_quote(value_str): >=20 > + value_str =3D bytes_to_bracket_str(value_str[1:-1].encode()) >=20 > + elif (',' in value_str) and (value_str[0] !=3D '{'): >=20 > + value_str =3D '{ %s }' % value_str >=20 > + if value_str[0] =3D=3D '{': >=20 > + result =3D expand_file_value(self._yaml_path, value_str) >=20 > + if len(result) =3D=3D 0: >=20 > + bin_list =3D value_str[1:-1].split(',') >=20 > + value =3D 0 >=20 > + bit_len =3D 0 >=20 > + unit_len =3D 1 >=20 > + for idx, element in enumerate(bin_list): >=20 > + each =3D element.strip() >=20 > + if len(each) =3D=3D 0: >=20 > + continue >=20 > + >=20 > + in_bit_field =3D False >=20 > + if each[0] in "'" + '"': >=20 > + each_value =3D bytearray(each[1:-1], 'utf-8') >=20 > + elif ':' in each: >=20 > + match =3D re.match("^(.+):(\\d+)([b|B|W|D|Q])$",= each) >=20 > + if match is None: >=20 > + raise SystemExit("Exception: Invald value\ >=20 > +list format '%s' !" % each) >=20 > + if match.group(1) =3D=3D '0' and match.group(2) = =3D=3D '0': >=20 > + unit_len =3D CGenYamlCfg.bits_width[match.gr= oup(3) >=20 > + ] // 8 >=20 > + cur_bit_len =3D int(match.group(2) >=20 > + ) * CGenYamlCfg.bits_width[ >=20 > + match.group(3)] >=20 > + value +=3D ((self.eval(match.group(1)) & ( >=20 > + 1 << cur_bit_len) - 1)) << bit_len >=20 > + bit_len +=3D cur_bit_len >=20 > + each_value =3D bytearray() >=20 > + if idx + 1 < len(bin_list): >=20 > + in_bit_field =3D True >=20 > + else: >=20 > + try: >=20 > + each_value =3D value_to_bytearray( >=20 > + self.eval(each.strip()), unit_len) >=20 > + except Exception: >=20 > + raise SystemExit("Exception: Value %d cannot= \ >=20 > +fit into %s bytes !" % (each, unit_len)) >=20 > + >=20 > + if not in_bit_field: >=20 > + if bit_len > 0: >=20 > + if bit_len % 8 !=3D 0: >=20 > + raise SystemExit("Exception: Invalid bit= \ >=20 > +field alignment '%s' !" % value_str) >=20 > + result.extend(value_to_bytes(value, bit_len = // 8)) >=20 > + value =3D 0 >=20 > + bit_len =3D 0 >=20 > + >=20 > + result.extend(each_value) >=20 > + >=20 > + elif check_quote(value_str): >=20 > + result =3D bytearray(value_str[1:-1], 'utf-8') # Excluding = quotes >=20 > + else: >=20 > + result =3D value_to_bytearray(self.eval(value_str), length) >=20 > + >=20 > + if len(result) < length: >=20 > + result.extend(b'\x00' * (length - len(result))) >=20 > + elif len(result) > length: >=20 > + raise SystemExit("Exception: Value '%s' is too big to fit \ >=20 > +into %d bytes !" 
% (value_str, length)) >=20 > + >=20 > + if array: >=20 > + return result >=20 > + else: >=20 > + return bytes_to_value(result) >=20 > + >=20 > + return result >=20 > + >=20 > + def get_cfg_item_options(self, item): >=20 > + tmp_list =3D [] >=20 > + if item['type'] =3D=3D "Combo": >=20 > + if item['option'] in CGenYamlCfg.builtin_option: >=20 > + for op_val, op_str in CGenYamlCfg.builtin_option[item['o= ption' >=20 > + ]]= : >=20 > + tmp_list.append((op_val, op_str)) >=20 > + else: >=20 > + opt_list =3D item['option'].split(',') >=20 > + for option in opt_list: >=20 > + option =3D option.strip() >=20 > + try: >=20 > + (op_val, op_str) =3D option.split(':') >=20 > + except Exception: >=20 > + raise SystemExit("Exception: Invalide \ >=20 > +option format '%s' !" % option) >=20 > + tmp_list.append((op_val, op_str)) >=20 > + return tmp_list >=20 > + >=20 > + def get_page_title(self, page_id, top=3DNone): >=20 > + if top is None: >=20 > + top =3D self.get_cfg_page()['root'] >=20 > + for node in top['child']: >=20 > + page_key =3D next(iter(node)) >=20 > + if page_id =3D=3D page_key: >=20 > + return node[page_key]['title'] >=20 > + else: >=20 > + result =3D self.get_page_title(page_id, node[page_key]) >=20 > + if result is not None: >=20 > + return result >=20 > + return None >=20 > + >=20 > + def print_pages(self, top=3DNone, level=3D0): >=20 > + if top is None: >=20 > + top =3D self.get_cfg_page()['root'] >=20 > + for node in top['child']: >=20 > + page_id =3D next(iter(node)) >=20 > + print('%s%s: %s' % (' ' * level, page_id, node[page_id]['ti= tle'])) >=20 > + level +=3D 1 >=20 > + self.print_pages(node[page_id], level) >=20 > + level -=3D 1 >=20 > + >=20 > + def get_item_by_index(self, index): >=20 > + return self._cfg_list[index] >=20 > + >=20 > + def get_item_by_path(self, path): >=20 > + node =3D self.locate_cfg_item(path) >=20 > + if node: >=20 > + return self.get_item_by_index(node['indx']) >=20 > + else: >=20 > + return None >=20 > + >=20 > + def locate_cfg_path(self, item): >=20 > + def _locate_cfg_path(root, level=3D0): >=20 > + # config structure >=20 > + if item is root: >=20 > + return path >=20 > + for key in root: >=20 > + if type(root[key]) is OrderedDict: >=20 > + level +=3D 1 >=20 > + path.append(key) >=20 > + ret =3D _locate_cfg_path(root[key], level) >=20 > + if ret: >=20 > + return ret >=20 > + path.pop() >=20 > + return None >=20 > + path =3D [] >=20 > + return _locate_cfg_path(self._cfg_tree) >=20 > + >=20 > + def locate_cfg_item(self, path, allow_exp=3DTrue): >=20 > + def _locate_cfg_item(root, path, level=3D0): >=20 > + if len(path) =3D=3D level: >=20 > + return root >=20 > + next_root =3D root.get(path[level], None) >=20 > + if next_root is None: >=20 > + if allow_exp: >=20 > + raise Exception('Not a valid CFG config option path:= %s' % >=20 > + '.'.join(path[:level+1])) >=20 > + else: >=20 > + return None >=20 > + return _locate_cfg_item(next_root, path, level + 1) >=20 > + >=20 > + path_nodes =3D path.split('.') >=20 > + return _locate_cfg_item(self._cfg_tree, path_nodes) >=20 > + >=20 > + def traverse_cfg_tree(self, handler, top=3DNone): >=20 > + def _traverse_cfg_tree(root, level=3D0): >=20 > + # config structure >=20 > + for key in root: >=20 > + if type(root[key]) is OrderedDict: >=20 > + level +=3D 1 >=20 > + handler(key, root[key], level) >=20 > + _traverse_cfg_tree(root[key], level) >=20 > + level -=3D 1 >=20 > + >=20 > + if top is None: >=20 > + top =3D self._cfg_tree >=20 > + _traverse_cfg_tree(top) >=20 > + >=20 > + def print_cfgs(self, root=3DNone, 
short=3DTrue, print_level=3D256): >=20 > + def _print_cfgs(name, cfgs, level): >=20 > + >=20 > + if 'indx' in cfgs: >=20 > + act_cfg =3D self.get_item_by_index(cfgs['indx']) >=20 > + else: >=20 > + offset =3D 0 >=20 > + length =3D 0 >=20 > + value =3D '' >=20 > + if CGenYamlCfg.STRUCT in cfgs: >=20 > + cfg =3D cfgs[CGenYamlCfg.STRUCT] >=20 > + offset =3D int(cfg['offset']) >=20 > + length =3D int(cfg['length']) >=20 > + if 'value' in cfg: >=20 > + value =3D cfg['value'] >=20 > + if length =3D=3D 0: >=20 > + return >=20 > + act_cfg =3D dict({'value': value, 'offset': offset, >=20 > + 'length': length}) >=20 > + value =3D act_cfg['value'] >=20 > + bit_len =3D act_cfg['length'] >=20 > + offset =3D (act_cfg['offset'] + 7) // 8 >=20 > + if value !=3D '': >=20 > + try: >=20 > + value =3D self.reformat_value_str(act_cfg['value'], >=20 > + act_cfg['length']) >=20 > + except Exception: >=20 > + value =3D act_cfg['value'] >=20 > + length =3D bit_len // 8 >=20 > + bit_len =3D '(%db)' % bit_len if bit_len % 8 else '' * 4 >=20 > + if level <=3D print_level: >=20 > + if short and len(value) > 40: >=20 > + value =3D '%s ... %s' % (value[:20], value[-20:]) >=20 > + print('%04X:%04X%-6s %s%s : %s' % (offset, length, bit_l= en, >=20 > + ' ' * level, name, v= alue)) >=20 > + >=20 > + self.traverse_cfg_tree(_print_cfgs) >=20 > + >=20 > + def build_var_dict(self): >=20 > + def _build_var_dict(name, cfgs, level): >=20 > + if level <=3D 2: >=20 > + if CGenYamlCfg.STRUCT in cfgs: >=20 > + struct_info =3D cfgs[CGenYamlCfg.STRUCT] >=20 > + self._var_dict['_LENGTH_%s_' % name] =3D struct_info= [ >=20 > + 'length'] // 8 >=20 > + self._var_dict['_OFFSET_%s_' % name] =3D struct_info= [ >=20 > + 'offset'] // 8 >=20 > + >=20 > + self._var_dict =3D {} >=20 > + self.traverse_cfg_tree(_build_var_dict) >=20 > + self._var_dict['_LENGTH_'] =3D self._cfg_tree[CGenYamlCfg.STRUCT= ][ >=20 > + 'length'] // 8 >=20 > + return 0 >=20 > + >=20 > + def add_cfg_page(self, child, parent, title=3D''): >=20 > + def _add_cfg_page(cfg_page, child, parent): >=20 > + key =3D next(iter(cfg_page)) >=20 > + if parent =3D=3D key: >=20 > + cfg_page[key]['child'].append({child: {'title': title, >=20 > + 'child': []}}) >=20 > + return True >=20 > + else: >=20 > + result =3D False >=20 > + for each in cfg_page[key]['child']: >=20 > + if _add_cfg_page(each, child, parent): >=20 > + result =3D True >=20 > + break >=20 > + return result >=20 > + >=20 > + return _add_cfg_page(self._cfg_page, child, parent) >=20 > + >=20 > + def set_cur_page(self, page_str): >=20 > + if not page_str: >=20 > + return >=20 > + >=20 > + if ',' in page_str: >=20 > + page_list =3D page_str.split(',') >=20 > + else: >=20 > + page_list =3D [page_str] >=20 > + for page_str in page_list: >=20 > + parts =3D page_str.split(':') >=20 > + if len(parts) in [1, 3]: >=20 > + page =3D parts[0].strip() >=20 > + if len(parts) =3D=3D 3: >=20 > + # it is a new page definition, add it into tree >=20 > + parent =3D parts[1] if parts[1] else 'root' >=20 > + parent =3D parent.strip() >=20 > + if parts[2][0] =3D=3D '"' and parts[2][-1] =3D=3D '"= ': >=20 > + parts[2] =3D parts[2][1:-1] >=20 > + >=20 > + if not self.add_cfg_page(page, parent, parts[2]): >=20 > + raise SystemExit("Error: Cannot find parent page= \ >=20 > +'%s'!" % parent) >=20 > + else: >=20 > + raise SystemExit("Error: Invalid page format '%s' !" 
>=20 > + % page_str) >=20 > + self._cur_page =3D page >=20 > + >=20 > + def extend_variable(self, line): >=20 > + # replace all variables >=20 > + if line =3D=3D '': >=20 > + return line >=20 > + loop =3D 2 >=20 > + while loop > 0: >=20 > + line_after =3D DefTemplate(line).safe_substitute(self._def_d= ict) >=20 > + if line =3D=3D line_after: >=20 > + break >=20 > + loop -=3D 1 >=20 > + line =3D line_after >=20 > + return line_after >=20 > + >=20 > + def reformat_number_per_type(self, itype, value): >=20 > + if check_quote(value) or value.startswith('{'): >=20 > + return value >=20 > + parts =3D itype.split(',') >=20 > + if len(parts) > 3 and parts[0] =3D=3D 'EditNum': >=20 > + num_fmt =3D parts[1].strip() >=20 > + else: >=20 > + num_fmt =3D '' >=20 > + if num_fmt =3D=3D 'HEX' and not value.startswith('0x'): >=20 > + value =3D '0x%X' % int(value, 10) >=20 > + elif num_fmt =3D=3D 'DEC' and value.startswith('0x'): >=20 > + value =3D '%d' % int(value, 16) >=20 > + return value >=20 > + >=20 > + def add_cfg_item(self, name, item, offset, path): >=20 > + >=20 > + self.set_cur_page(item.get('page', '')) >=20 > + >=20 > + if name[0] =3D=3D '$': >=20 > + # skip all virtual node >=20 > + return 0 >=20 > + >=20 > + if not set(item).issubset(CGenYamlCfg.keyword_set): >=20 > + for each in list(item): >=20 > + if each not in CGenYamlCfg.keyword_set: >=20 > + raise Exception("Invalid attribute '%s' for '%s'!" % >=20 > + (each, '.'.join(path))) >=20 > + >=20 > + length =3D item.get('length', 0) >=20 > + if type(length) is str: >=20 > + match =3D re.match("^(\\d+)([b|B|W|D|Q])([B|W|D|Q]?)\\s*$", = length) >=20 > + if match: >=20 > + unit_len =3D CGenYamlCfg.bits_width[match.group(2)] >=20 > + length =3D int(match.group(1), 10) * unit_len >=20 > + else: >=20 > + try: >=20 > + length =3D int(length, 0) * 8 >=20 > + except Exception: >=20 > + raise Exception("Invalid length field '%s' for '%s' = !" % >=20 > + (length, '.'.join(path))) >=20 > + >=20 > + if offset % 8 > 0: >=20 > + raise Exception("Invalid alignment for field '%s' fo= r \ >=20 > +'%s' !" % (name, '.'.join(path))) >=20 > + else: >=20 > + # define is length in bytes >=20 > + length =3D length * 8 >=20 > + >=20 > + if not name.isidentifier(): >=20 > + raise Exception("Invalid config name '%s' for '%s' !" % >=20 > + (name, '.'.join(path))) >=20 > + >=20 > + itype =3D str(item.get('type', 'Reserved')) >=20 > + value =3D str(item.get('value', '')) >=20 > + if value: >=20 > + if not (check_quote(value) or value.startswith('{')): >=20 > + if ',' in value: >=20 > + value =3D '{ %s }' % value >=20 > + else: >=20 > + value =3D self.reformat_number_per_type(itype, value= ) >=20 > + >=20 > + help =3D str(item.get('help', '')) >=20 > + if '\n' in help: >=20 > + help =3D ' '.join([i.strip() for i in help.splitlines()]) >=20 > + >=20 > + option =3D str(item.get('option', '')) >=20 > + if '\n' in option: >=20 > + option =3D ' '.join([i.strip() for i in option.splitlines()]= ) >=20 > + >=20 > + # extend variables for value and condition >=20 > + condition =3D str(item.get('condition', '')) >=20 > + if condition: >=20 > + condition =3D self.extend_variable(condition) >=20 > + value =3D self.extend_variable(value) >=20 > + >=20 > + order =3D str(item.get('order', '')) >=20 > + if order: >=20 > + if '.' 
in order: >=20 > + (major, minor) =3D order.split('.') >=20 > + order =3D int(major, 16) >=20 > + else: >=20 > + order =3D int(order, 16) >=20 > + else: >=20 > + order =3D offset >=20 > + >=20 > + cfg_item =3D dict() >=20 > + cfg_item['length'] =3D length >=20 > + cfg_item['offset'] =3D offset >=20 > + cfg_item['value'] =3D value >=20 > + cfg_item['type'] =3D itype >=20 > + cfg_item['cname'] =3D str(name) >=20 > + cfg_item['name'] =3D str(item.get('name', '')) >=20 > + cfg_item['help'] =3D help >=20 > + cfg_item['option'] =3D option >=20 > + cfg_item['page'] =3D self._cur_page >=20 > + cfg_item['order'] =3D order >=20 > + cfg_item['path'] =3D '.'.join(path) >=20 > + cfg_item['condition'] =3D condition >=20 > + if 'struct' in item: >=20 > + cfg_item['struct'] =3D item['struct'] >=20 > + self._cfg_list.append(cfg_item) >=20 > + >=20 > + item['indx'] =3D len(self._cfg_list) - 1 >=20 > + >=20 > + # remove used info for reducing pkl size >=20 > + item.pop('option', None) >=20 > + item.pop('condition', None) >=20 > + item.pop('help', None) >=20 > + item.pop('name', None) >=20 > + item.pop('page', None) >=20 > + >=20 > + return length >=20 > + >=20 > + def build_cfg_list(self, cfg_name=3D'', top=3DNone, path=3D[], >=20 > + info=3D{'offset': 0}): >=20 > + if top is None: >=20 > + top =3D self._cfg_tree >=20 > + info.clear() >=20 > + info =3D {'offset': 0} >=20 > + >=20 > + start =3D info['offset'] >=20 > + is_leaf =3D True >=20 > + for key in top: >=20 > + path.append(key) >=20 > + if type(top[key]) is OrderedDict: >=20 > + is_leaf =3D False >=20 > + self.build_cfg_list(key, top[key], path, info) >=20 > + path.pop() >=20 > + >=20 > + if is_leaf: >=20 > + length =3D self.add_cfg_item(cfg_name, top, info['offset'], = path) >=20 > + info['offset'] +=3D length >=20 > + elif cfg_name =3D=3D '' or (cfg_name and cfg_name[0] !=3D '$'): >=20 > + # check first element for struct >=20 > + first =3D next(iter(top)) >=20 > + struct_str =3D CGenYamlCfg.STRUCT >=20 > + if first !=3D struct_str: >=20 > + struct_node =3D OrderedDict({}) >=20 > + top[struct_str] =3D struct_node >=20 > + top.move_to_end(struct_str, False) >=20 > + else: >=20 > + struct_node =3D top[struct_str] >=20 > + struct_node['offset'] =3D start >=20 > + struct_node['length'] =3D info['offset'] - start >=20 > + if struct_node['length'] % 8 !=3D 0: >=20 > + raise SystemExit("Error: Bits length not aligned for %s = !" 
% >=20 > + str(path)) >=20 > + >=20 > + def get_field_value(self, top=3DNone): >=20 > + def _get_field_value(name, cfgs, level): >=20 > + if 'indx' in cfgs: >=20 > + act_cfg =3D self.get_item_by_index(cfgs['indx']) >=20 > + if act_cfg['length'] =3D=3D 0: >=20 > + return >=20 > + value =3D self.get_value(act_cfg['value'], act_cfg['leng= th'], >=20 > + False) >=20 > + set_bits_to_bytes(result, act_cfg['offset'] - >=20 > + struct_info['offset'], act_cfg['length= '], >=20 > + value) >=20 > + >=20 > + if top is None: >=20 > + top =3D self._cfg_tree >=20 > + struct_info =3D top[CGenYamlCfg.STRUCT] >=20 > + result =3D bytearray((struct_info['length'] + 7) // 8) >=20 > + self.traverse_cfg_tree(_get_field_value, top) >=20 > + return result >=20 > + >=20 > + def set_field_value(self, top, value_bytes, force=3DFalse): >=20 > + def _set_field_value(name, cfgs, level): >=20 > + if 'indx' not in cfgs: >=20 > + return >=20 > + act_cfg =3D self.get_item_by_index(cfgs['indx']) >=20 > + if force or act_cfg['value'] =3D=3D '': >=20 > + value =3D get_bits_from_bytes(full_bytes, >=20 > + act_cfg['offset'] - >=20 > + struct_info['offset'], >=20 > + act_cfg['length']) >=20 > + act_val =3D act_cfg['value'] >=20 > + if act_val =3D=3D '': >=20 > + act_val =3D '%d' % value >=20 > + act_val =3D self.reformat_number_per_type(act_cfg >=20 > + ['type'], >=20 > + act_val) >=20 > + act_cfg['value'] =3D self.format_value_to_str( >=20 > + value, act_cfg['length'], act_val) >=20 > + >=20 > + if 'indx' in top: >=20 > + # it is config option >=20 > + value =3D bytes_to_value(value_bytes) >=20 > + act_cfg =3D self.get_item_by_index(top['indx']) >=20 > + act_cfg['value'] =3D self.format_value_to_str( >=20 > + value, act_cfg['length'], act_cfg['value']) >=20 > + else: >=20 > + # it is structure >=20 > + struct_info =3D top[CGenYamlCfg.STRUCT] >=20 > + length =3D struct_info['length'] // 8 >=20 > + full_bytes =3D bytearray(value_bytes[:length]) >=20 > + if len(full_bytes) < length: >=20 > + full_bytes.extend(bytearray(length - len(value_bytes))) >=20 > + self.traverse_cfg_tree(_set_field_value, top) >=20 > + >=20 > + def update_def_value(self): >=20 > + def _update_def_value(name, cfgs, level): >=20 > + if 'indx' in cfgs: >=20 > + act_cfg =3D self.get_item_by_index(cfgs['indx']) >=20 > + if act_cfg['value'] !=3D '' and act_cfg['length'] > 0: >=20 > + try: >=20 > + act_cfg['value'] =3D self.reformat_value_str( >=20 > + act_cfg['value'], act_cfg['length']) >=20 > + except Exception: >=20 > + raise Exception("Invalid value expression '%s' \ >=20 > +for '%s' !" 
% (act_cfg['value'], act_cfg['path'])) >=20 > + else: >=20 > + if CGenYamlCfg.STRUCT in cfgs and 'value' in \ >=20 > + cfgs[CGenYamlCfg.STRUCT]: >=20 > + curr =3D cfgs[CGenYamlCfg.STRUCT] >=20 > + value_bytes =3D value_to_bytearray(self.eval(curr['v= alue']), >=20 > + (curr['length'] + 7= ) // 8) >=20 > + self.set_field_value(cfgs, value_bytes) >=20 > + >=20 > + self.traverse_cfg_tree(_update_def_value, self._cfg_tree) >=20 > + >=20 > + def evaluate_condition(self, item): >=20 > + expr =3D item['condition'] >=20 > + result =3D self.parse_value(expr, 1, False) >=20 > + return result >=20 > + >=20 > + def detect_fsp(self): >=20 > + cfg_segs =3D self.get_cfg_segment() >=20 > + if len(cfg_segs) =3D=3D 3: >=20 > + fsp =3D True >=20 > + for idx, seg in enumerate(cfg_segs): >=20 > + if not seg[0].endswith('UPD_%s' % 'TMS'[idx]): >=20 > + fsp =3D False >=20 > + break >=20 > + else: >=20 > + fsp =3D False >=20 > + if fsp: >=20 > + self.set_mode('FSP') >=20 > + return fsp >=20 > + >=20 > + def get_cfg_segment(self): >=20 > + def _get_cfg_segment(name, cfgs, level): >=20 > + if 'indx' not in cfgs: >=20 > + if name.startswith('$ACTION_'): >=20 > + if 'find' in cfgs: >=20 > + find[0] =3D cfgs['find'] >=20 > + else: >=20 > + if find[0]: >=20 > + act_cfg =3D self.get_item_by_index(cfgs['indx']) >=20 > + segments.append([find[0], act_cfg['offset'] // 8, 0]= ) >=20 > + find[0] =3D '' >=20 > + return >=20 > + >=20 > + find =3D [''] >=20 > + segments =3D [] >=20 > + self.traverse_cfg_tree(_get_cfg_segment, self._cfg_tree) >=20 > + cfg_len =3D self._cfg_tree[CGenYamlCfg.STRUCT]['length'] // 8 >=20 > + if len(segments) =3D=3D 0: >=20 > + segments.append(['', 0, cfg_len]) >=20 > + >=20 > + if segments[0][1] !=3D 0: >=20 > + raise Exception('"find" attribute should only appear ' >=20 > + 'at the beginning of a config segment !') >=20 > + segments.append(['', cfg_len, 0]) >=20 > + cfg_segs =3D [] >=20 > + for idx, each in enumerate(segments[:-1]): >=20 > + cfg_segs.append((each[0], each[1], >=20 > + segments[idx+1][1] - each[1])) >=20 > + return cfg_segs >=20 > + >=20 > + def get_bin_segment(self, bin_data): >=20 > + cfg_segs =3D self.get_cfg_segment() >=20 > + bin_segs =3D [] >=20 > + for seg in cfg_segs: >=20 > + key =3D seg[0].encode() >=20 > + if key =3D=3D 0: >=20 > + bin_segs.append([seg[0], 0, len(bin_data)]) >=20 > + break >=20 > + pos =3D bin_data.find(key) >=20 > + if pos >=3D 0: >=20 > + # ensure no other match for the key >=20 > + if bin_data[pos + len(seg[0]):].find(key) >=3D 0: >=20 > + print("Warning: Multiple matches for '%s' " >=20 > + "in binary, the 1st instance will be used !" >=20 > + % seg[0]) >=20 > + bin_segs.append([seg[0], pos, seg[2]]) >=20 > + else: >=20 > + raise Exception("Could not find '%s' in binary !" 
>=20 > + % seg[0]) >=20 > + return bin_segs >=20 > + >=20 > + def extract_cfg_from_bin(self, bin_data): >=20 > + # get cfg bin length >=20 > + cfg_bins =3D bytearray() >=20 > + bin_segs =3D self.get_bin_segment(bin_data) >=20 > + for each in bin_segs: >=20 > + cfg_bins.extend(bin_data[each[1]:each[1] + each[2]]) >=20 > + return cfg_bins >=20 > + >=20 > + def save_current_to_bin(self): >=20 > + cfg_bins =3D self.generate_binary_array() >=20 > + if self._old_bin is None: >=20 > + return cfg_bins >=20 > + >=20 > + bin_data =3D bytearray(self._old_bin) >=20 > + bin_segs =3D self.get_bin_segment(self._old_bin) >=20 > + cfg_off =3D 0 >=20 > + for each in bin_segs: >=20 > + length =3D each[2] >=20 > + bin_data[each[1]:each[1] + length] =3D cfg_bins[cfg_off: >=20 > + cfg_off >=20 > + + length] >=20 > + cfg_off +=3D length >=20 > + print('Patched the loaded binary successfully !') >=20 > + >=20 > + return bin_data >=20 > + >=20 > + def load_default_from_bin(self, bin_data): >=20 > + self._old_bin =3D bin_data >=20 > + cfg_bins =3D self.extract_cfg_from_bin(bin_data) >=20 > + self.set_field_value(self._cfg_tree, cfg_bins, True) >=20 > + return cfg_bins >=20 > + >=20 > + def generate_binary_array(self, path=3D''): >=20 > + if path =3D=3D '': >=20 > + top =3D None >=20 > + else: >=20 > + top =3D self.locate_cfg_item(path) >=20 > + if not top: >=20 > + raise Exception("Invalid configuration path '%s' !" >=20 > + % path) >=20 > + return self.get_field_value(top) >=20 > + >=20 > + def generate_binary(self, bin_file_name, path=3D''): >=20 > + bin_file =3D open(bin_file_name, "wb") >=20 > + bin_file.write(self.generate_binary_array(path)) >=20 > + bin_file.close() >=20 > + return 0 >=20 > + >=20 > + def write_delta_file(self, out_file, platform_id, out_lines): >=20 > + dlt_fd =3D open(out_file, "w") >=20 > + dlt_fd.write("%s\n" % get_copyright_header('dlt', True)) >=20 > + if platform_id is not None: >=20 > + dlt_fd.write('#\n') >=20 > + dlt_fd.write('# Delta configuration values for ' >=20 > + 'platform ID 0x%04X\n' >=20 > + % platform_id) >=20 > + dlt_fd.write('#\n\n') >=20 > + for line in out_lines: >=20 > + dlt_fd.write('%s\n' % line) >=20 > + dlt_fd.close() >=20 > + >=20 > + def override_default_value(self, dlt_file): >=20 > + error =3D 0 >=20 > + dlt_lines =3D CGenYamlCfg.expand_include_files(dlt_file) >=20 > + >=20 > + platform_id =3D None >=20 > + for line, file_path, line_num in dlt_lines: >=20 > + line =3D line.strip() >=20 > + if not line or line.startswith('#'): >=20 > + continue >=20 > + match =3D re.match("\\s*([\\w\\.]+)\\s*\\|\\s*(.+)", line) >=20 > + if not match: >=20 > + raise Exception("Unrecognized line '%s' " >=20 > + "(File:'%s' Line:%d) !" >=20 > + % (line, file_path, line_num + 1)) >=20 > + >=20 > + path =3D match.group(1) >=20 > + value_str =3D match.group(2) >=20 > + top =3D self.locate_cfg_item(path) >=20 > + if not top: >=20 > + raise Exception( >=20 > + "Invalid configuration '%s' (File:'%s' Line:%d) !" 
% >=20 > + (path, file_path, line_num + 1)) >=20 > + >=20 > + if 'indx' in top: >=20 > + act_cfg =3D self.get_item_by_index(top['indx']) >=20 > + bit_len =3D act_cfg['length'] >=20 > + else: >=20 > + struct_info =3D top[CGenYamlCfg.STRUCT] >=20 > + bit_len =3D struct_info['length'] >=20 > + >=20 > + value_bytes =3D self.parse_value(value_str, bit_len) >=20 > + self.set_field_value(top, value_bytes, True) >=20 > + >=20 > + if path =3D=3D 'PLATFORMID_CFG_DATA.PlatformId': >=20 > + platform_id =3D value_str >=20 > + >=20 > + if platform_id is None: >=20 > + raise Exception( >=20 > + "PLATFORMID_CFG_DATA.PlatformId is missing " >=20 > + "in file '%s' !" % >=20 > + (dlt_file)) >=20 > + >=20 > + return error >=20 > + >=20 > + def generate_delta_file_from_bin(self, delta_file, old_data, >=20 > + new_data, full=3DFalse): >=20 > + new_data =3D self.load_default_from_bin(new_data) >=20 > + lines =3D [] >=20 > + platform_id =3D None >=20 > + def_platform_id =3D 0 >=20 > + >=20 > + for item in self._cfg_list: >=20 > + if not full and (item['type'] in ['Reserved']): >=20 > + continue >=20 > + old_val =3D get_bits_from_bytes(old_data, item['offset'], >=20 > + item['length']) >=20 > + new_val =3D get_bits_from_bytes(new_data, item['offset'], >=20 > + item['length']) >=20 > + >=20 > + full_name =3D item['path'] >=20 > + if 'PLATFORMID_CFG_DATA.PlatformId' =3D=3D full_name: >=20 > + def_platform_id =3D old_val >=20 > + if new_val !=3D old_val or full: >=20 > + val_str =3D self.reformat_value_str(item['value'], >=20 > + item['length']) >=20 > + text =3D '%-40s | %s' % (full_name, val_str) >=20 > + lines.append(text) >=20 > + >=20 > + if self.get_mode() !=3D 'FSP': >=20 > + if platform_id is None or def_platform_id =3D=3D platfor= m_id: >=20 > + platform_id =3D def_platform_id >=20 > + print("WARNING: 'PlatformId' configuration is " >=20 > + "same as default %d!" 
% platform_id) >=20 > + >=20 > + lines.insert(0, '%-40s | %s\n\n' % >=20 > + ('PLATFORMID_CFG_DATA.PlatformId', >=20 > + '0x%04X' % platform_id)) >=20 > + else: >=20 > + platform_id =3D None >=20 > + >=20 > + self.write_delta_file(delta_file, platform_id, lines) >=20 > + return 0 >=20 > + >=20 > + def generate_delta_file(self, delta_file, bin_file, bin_file2, full= =3DFalse): >=20 > + fd =3D open(bin_file, 'rb') >=20 > + new_data =3D bytearray(fd.read()) >=20 > + fd.close() >=20 > + >=20 > + if bin_file2 =3D=3D '': >=20 > + old_data =3D self.generate_binary_array() >=20 > + else: >=20 > + old_data =3D new_data >=20 > + fd =3D open(bin_file2, 'rb') >=20 > + new_data =3D bytearray(fd.read()) >=20 > + fd.close() >=20 > + >=20 > + return self.generate_delta_file_from_bin(delta_file, >=20 > + old_data, new_data, ful= l) >=20 > + >=20 > + def prepare_marshal(self, is_save): >=20 > + if is_save: >=20 > + # Ordered dict is not marshallable, convert to list >=20 > + self._cfg_tree =3D CGenYamlCfg.deep_convert_dict(self._cfg_t= ree) >=20 > + else: >=20 > + # Revert it back >=20 > + self._cfg_tree =3D CGenYamlCfg.deep_convert_list(self._cfg_t= ree) >=20 > + >=20 > + def generate_yml_file(self, in_file, out_file): >=20 > + cfg_yaml =3D CFG_YAML() >=20 > + text =3D cfg_yaml.expand_yaml(in_file) >=20 > + yml_fd =3D open(out_file, "w") >=20 > + yml_fd.write(text) >=20 > + yml_fd.close() >=20 > + return 0 >=20 > + >=20 > + def write_cfg_header_file(self, hdr_file_name, tag_mode, >=20 > + tag_dict, struct_list): >=20 > + lines =3D [] >=20 > + lines.append('\n\n') >=20 > + if self.get_mode() =3D=3D 'FSP': >=20 > + lines.append('#include \n') >=20 > + >=20 > + tag_mode =3D tag_mode & 0x7F >=20 > + tag_list =3D sorted(list(tag_dict.items()), key=3Dlambda x: x[1]= ) >=20 > + for tagname, tagval in tag_list: >=20 > + if (tag_mode =3D=3D 0 and tagval >=3D 0x100) or \ >=20 > + (tag_mode =3D=3D 1 and tagval < 0x100): >=20 > + continue >=20 > + lines.append('#define %-30s 0x%03X\n' % ( >=20 > + 'CDATA_%s_TAG' % tagname[:-9], tagval)) >=20 > + lines.append('\n\n') >=20 > + >=20 > + name_dict =3D {} >=20 > + new_dict =3D {} >=20 > + for each in struct_list: >=20 > + if (tag_mode =3D=3D 0 and each['tag'] >=3D 0x100) or \ >=20 > + (tag_mode =3D=3D 1 and each['tag'] < 0x100): >=20 > + continue >=20 > + new_dict[each['name']] =3D (each['alias'], each['count']) >=20 > + if each['alias'] not in name_dict: >=20 > + name_dict[each['alias']] =3D 1 >=20 > + lines.extend(self.create_struct(each['alias'], >=20 > + each['node'], new_dict)) >=20 > + lines.append('#pragma pack()\n\n') >=20 > + >=20 > + self.write_header_file(lines, hdr_file_name) >=20 > + >=20 > + def write_header_file(self, txt_body, file_name, type=3D'h'): >=20 > + file_name_def =3D os.path.basename(file_name).replace('.', '_') >=20 > + file_name_def =3D re.sub('(.)([A-Z][a-z]+)', r'\1_\2', file_name= _def) >=20 > + file_name_def =3D re.sub('([a-z0-9])([A-Z])', r'\1_\2', >=20 > + file_name_def).upper() >=20 > + >=20 > + lines =3D [] >=20 > + lines.append("%s\n" % get_copyright_header(type)) >=20 > + lines.append("#ifndef __%s__\n" % file_name_def) >=20 > + lines.append("#define __%s__\n\n" % file_name_def) >=20 > + if type =3D=3D 'h': >=20 > + lines.append("#pragma pack(1)\n\n") >=20 > + lines.extend(txt_body) >=20 > + if type =3D=3D 'h': >=20 > + lines.append("#pragma pack()\n\n") >=20 > + lines.append("#endif\n") >=20 > + >=20 > + # Don't rewrite if the contents are the same >=20 > + create =3D True >=20 > + if os.path.exists(file_name): >=20 > + hdr_file =3D 
open(file_name, "r") >=20 > + org_txt =3D hdr_file.read() >=20 > + hdr_file.close() >=20 > + >=20 > + new_txt =3D ''.join(lines) >=20 > + if org_txt =3D=3D new_txt: >=20 > + create =3D False >=20 > + >=20 > + if create: >=20 > + hdr_file =3D open(file_name, "w") >=20 > + hdr_file.write(''.join(lines)) >=20 > + hdr_file.close() >=20 > + >=20 > + def generate_data_inc_file(self, dat_inc_file_name, bin_file=3DNone)= : >=20 > + # Put a prefix GUID before CFGDATA so that it can be located lat= er on >=20 > + prefix =3D b'\xa7\xbd\x7f\x73\x20\x1e\x46\xd6\ >=20 > +xbe\x8f\x64\x12\x05\x8d\x0a\xa8' >=20 > + if bin_file: >=20 > + fin =3D open(bin_file, 'rb') >=20 > + bin_dat =3D prefix + bytearray(fin.read()) >=20 > + fin.close() >=20 > + else: >=20 > + bin_dat =3D prefix + self.generate_binary_array() >=20 > + >=20 > + file_name =3D os.path.basename(dat_inc_file_name).upper() >=20 > + file_name =3D file_name.replace('.', '_') >=20 > + >=20 > + txt_lines =3D [] >=20 > + >=20 > + txt_lines.append("UINT8 mConfigDataBlob[%d] =3D {\n" % len(bin_= dat)) >=20 > + count =3D 0 >=20 > + line =3D [' '] >=20 > + for each in bin_dat: >=20 > + line.append('0x%02X, ' % each) >=20 > + count =3D count + 1 >=20 > + if (count & 0x0F) =3D=3D 0: >=20 > + line.append('\n') >=20 > + txt_lines.append(''.join(line)) >=20 > + line =3D [' '] >=20 > + if len(line) > 1: >=20 > + txt_lines.append(''.join(line) + '\n') >=20 > + >=20 > + txt_lines.append("};\n\n") >=20 > + self.write_header_file(txt_lines, dat_inc_file_name, 'inc') >=20 > + >=20 > + return 0 >=20 > + >=20 > + def get_struct_array_info(self, input): >=20 > + parts =3D input.split(':') >=20 > + if len(parts) > 1: >=20 > + var =3D parts[1] >=20 > + input =3D parts[0] >=20 > + else: >=20 > + var =3D '' >=20 > + array_str =3D input.split('[') >=20 > + name =3D array_str[0] >=20 > + if len(array_str) > 1: >=20 > + num_str =3D ''.join(c for c in array_str[-1] if c.isdigit()) >=20 > + num_str =3D '1000' if len(num_str) =3D=3D 0 else num_str >=20 > + array_num =3D int(num_str) >=20 > + else: >=20 > + array_num =3D 0 >=20 > + return name, array_num, var >=20 > + >=20 > + def process_multilines(self, string, max_char_length): >=20 > + multilines =3D '' >=20 > + string_length =3D len(string) >=20 > + current_string_start =3D 0 >=20 > + string_offset =3D 0 >=20 > + break_line_dict =3D [] >=20 > + if len(string) <=3D max_char_length: >=20 > + while (string_offset < string_length): >=20 > + if string_offset >=3D 1: >=20 > + if string[string_offset - 1] =3D=3D '\\' and string[ >=20 > + string_offset] =3D=3D 'n': >=20 > + break_line_dict.append(string_offset + 1) >=20 > + string_offset +=3D 1 >=20 > + if break_line_dict !=3D []: >=20 > + for each in break_line_dict: >=20 > + multilines +=3D " %s\n" % string[ >=20 > + current_string_start:each].lstrip() >=20 > + current_string_start =3D each >=20 > + if string_length - current_string_start > 0: >=20 > + multilines +=3D " %s\n" % string[ >=20 > + current_string_start:].lstrip() >=20 > + else: >=20 > + multilines =3D " %s\n" % string >=20 > + else: >=20 > + new_line_start =3D 0 >=20 > + new_line_count =3D 0 >=20 > + found_space_char =3D False >=20 > + while (string_offset < string_length): >=20 > + if string_offset >=3D 1: >=20 > + if new_line_count >=3D max_char_length - 1: >=20 > + if string[string_offset] =3D=3D ' ' and \ >=20 > + string_length - string_offset > 10: >=20 > + break_line_dict.append(new_line_start >=20 > + + new_line_count) >=20 > + new_line_start =3D new_line_start + new_line= _count >=20 > + new_line_count =3D 0 
>=20 > + found_space_char =3D True >=20 > + elif string_offset =3D=3D string_length - 1 and = \ >=20 > + found_space_char is False: >=20 > + break_line_dict.append(0) >=20 > + if string[string_offset - 1] =3D=3D '\\' and string[ >=20 > + string_offset] =3D=3D 'n': >=20 > + break_line_dict.append(string_offset + 1) >=20 > + new_line_start =3D string_offset + 1 >=20 > + new_line_count =3D 0 >=20 > + string_offset +=3D 1 >=20 > + new_line_count +=3D 1 >=20 > + if break_line_dict !=3D []: >=20 > + break_line_dict.sort() >=20 > + for each in break_line_dict: >=20 > + if each > 0: >=20 > + multilines +=3D " %s\n" % string[ >=20 > + current_string_start:each].lstrip() >=20 > + current_string_start =3D each >=20 > + if string_length - current_string_start > 0: >=20 > + multilines +=3D " %s\n" % \ >=20 > + string[current_string_start:].lstrip() >=20 > + return multilines >=20 > + >=20 > + def create_field(self, item, name, length, offset, struct, >=20 > + bsf_name, help, option, bits_length=3DNone): >=20 > + pos_name =3D 28 >=20 > + name_line =3D '' >=20 > + # help_line =3D '' >=20 > + # option_line =3D '' >=20 > + >=20 > + if length =3D=3D 0 and name =3D=3D 'dummy': >=20 > + return '\n' >=20 > + >=20 > + if bits_length =3D=3D 0: >=20 > + return '\n' >=20 > + >=20 > + is_array =3D False >=20 > + if length in [1, 2, 4, 8]: >=20 > + type =3D "UINT%d" % (length * 8) >=20 > + else: >=20 > + is_array =3D True >=20 > + type =3D "UINT8" >=20 > + >=20 > + if item and item['value'].startswith('{'): >=20 > + type =3D "UINT8" >=20 > + is_array =3D True >=20 > + >=20 > + if struct !=3D '': >=20 > + struct_base =3D struct.rstrip('*') >=20 > + name =3D '*' * (len(struct) - len(struct_base)) + name >=20 > + struct =3D struct_base >=20 > + type =3D struct >=20 > + if struct in ['UINT8', 'UINT16', 'UINT32', 'UINT64']: >=20 > + is_array =3D True >=20 > + unit =3D int(type[4:]) // 8 >=20 > + length =3D length / unit >=20 > + else: >=20 > + is_array =3D False >=20 > + >=20 > + if is_array: >=20 > + name =3D name + '[%d]' % length >=20 > + >=20 > + if len(type) < pos_name: >=20 > + space1 =3D pos_name - len(type) >=20 > + else: >=20 > + space1 =3D 1 >=20 > + >=20 > + if bsf_name !=3D '': >=20 > + name_line =3D " %s\n" % bsf_name >=20 > + else: >=20 > + name_line =3D "N/A\n" >=20 > + >=20 > + # if help !=3D '': >=20 > + # help_line =3D self.process_multilines(help, 80) >=20 > + >=20 > + # if option !=3D '': >=20 > + # option_line =3D self.process_multilines(option, 80) >=20 > + >=20 > + if offset is None: >=20 > + offset_str =3D '????' 
>=20 > + else: >=20 > + offset_str =3D '0x%04X' % offset >=20 > + >=20 > + if bits_length is None: >=20 > + bits_length =3D '' >=20 > + else: >=20 > + bits_length =3D ' : %d' % bits_length >=20 > + >=20 > + # return "\n/** %s%s%s**/\n %s%s%s%s;\n" % (name_line, help_lin= e, >=20 > + # option_line, type, ' ' * space1, name, bits_length) >=20 > + return "\n /* Offset %s: %s */\n %s%s%s%s;\n" % ( >=20 > + offset_str, name_line.strip(), type, ' ' * space1, >=20 > + name, bits_length) >=20 > + >=20 > + def create_struct(self, cname, top, struct_dict): >=20 > + index =3D 0 >=20 > + last =3D '' >=20 > + lines =3D [] >=20 > + off_base =3D -1 >=20 > + >=20 > + if cname in struct_dict: >=20 > + if struct_dict[cname][2]: >=20 > + return [] >=20 > + lines.append('\ntypedef struct {\n') >=20 > + for field in top: >=20 > + if field[0] =3D=3D '$': >=20 > + continue >=20 > + >=20 > + index +=3D 1 >=20 > + >=20 > + t_item =3D top[field] >=20 > + if 'indx' not in t_item: >=20 > + if CGenYamlCfg.STRUCT not in top[field]: >=20 > + continue >=20 > + >=20 > + if struct_dict[field][1] =3D=3D 0: >=20 > + continue >=20 > + >=20 > + append =3D True >=20 > + struct_info =3D top[field][CGenYamlCfg.STRUCT] >=20 > + >=20 > + if 'struct' in struct_info: >=20 > + struct, array_num, var =3D self.get_struct_array_inf= o( >=20 > + struct_info['struct']) >=20 > + if array_num > 0: >=20 > + if last =3D=3D struct: >=20 > + append =3D False >=20 > + last =3D struct >=20 > + if var =3D=3D '': >=20 > + var =3D field >=20 > + >=20 > + field =3D CGenYamlCfg.format_struct_field_name( >=20 > + var, struct_dict[field][1]) >=20 > + else: >=20 > + struct =3D struct_dict[field][0] >=20 > + field =3D CGenYamlCfg.format_struct_field_name( >=20 > + field, struct_dict[field][1]) >=20 > + >=20 > + if append: >=20 > + offset =3D t_item['$STRUCT']['offset'] // 8 >=20 > + if off_base =3D=3D -1: >=20 > + off_base =3D offset >=20 > + line =3D self.create_field(None, field, 0, 0, struct= , >=20 > + '', '', '') >=20 > + lines.append(' %s' % line) >=20 > + last =3D struct >=20 > + continue >=20 > + >=20 > + item =3D self.get_item_by_index(t_item['indx']) >=20 > + if item['cname'] =3D=3D 'CfgHeader' and index =3D=3D 1 or \ >=20 > + (item['cname'] =3D=3D 'CondValue' and index =3D=3D 2): >=20 > + continue >=20 > + >=20 > + bit_length =3D None >=20 > + length =3D (item['length'] + 7) // 8 >=20 > + match =3D re.match("^(\\d+)([b|B|W|D|Q])([B|W|D|Q]?)", >=20 > + t_item['length']) >=20 > + if match and match.group(2) =3D=3D 'b': >=20 > + bit_length =3D int(match.group(1)) >=20 > + if match.group(3) !=3D '': >=20 > + length =3D CGenYamlCfg.bits_width[match.group(3)] //= 8 >=20 > + else: >=20 > + length =3D 4 >=20 > + offset =3D item['offset'] // 8 >=20 > + if off_base =3D=3D -1: >=20 > + off_base =3D offset >=20 > + struct =3D item.get('struct', '') >=20 > + name =3D field >=20 > + prompt =3D item['name'] >=20 > + help =3D item['help'] >=20 > + option =3D item['option'] >=20 > + line =3D self.create_field(item, name, length, offset, struc= t, >=20 > + prompt, help, option, bit_length) >=20 > + lines.append(' %s' % line) >=20 > + last =3D struct >=20 > + >=20 > + lines.append('\n} %s;\n\n' % cname) >=20 > + >=20 > + return lines >=20 > + >=20 > + def write_fsp_sig_header_file(self, hdr_file_name): >=20 > + hdr_fd =3D open(hdr_file_name, 'w') >=20 > + hdr_fd.write("%s\n" % get_copyright_header('h')) >=20 > + hdr_fd.write("#ifndef __FSPUPD_H__\n" >=20 > + "#define __FSPUPD_H__\n\n" >=20 > + "#include \n\n" >=20 > + "#pragma pack(1)\n\n") >=20 > + lines =3D [] 
>=20 > + for fsp_comp in 'TMS': >=20 > + top =3D self.locate_cfg_item('FSP%s_UPD' % fsp_comp) >=20 > + if not top: >=20 > + raise Exception('Could not find FSP UPD definition !') >=20 > + bins =3D self.get_field_value(top) >=20 > + lines.append("#define FSP%s_UPD_SIGNATURE" >=20 > + " 0x%016X /* '%s' */\n\n" >=20 > + % (fsp_comp, bytes_to_value(bins[:8]), >=20 > + bins[:8].decode())) >=20 > + hdr_fd.write(''.join(lines)) >=20 > + hdr_fd.write("#pragma pack()\n\n" >=20 > + "#endif\n") >=20 > + hdr_fd.close() >=20 > + >=20 > + def create_header_file(self, hdr_file_name, com_hdr_file_name=3D'', = path=3D''): >=20 > + >=20 > + def _build_header_struct(name, cfgs, level): >=20 > + if CGenYamlCfg.STRUCT in cfgs: >=20 > + if 'CfgHeader' in cfgs: >=20 > + # collect CFGDATA TAG IDs >=20 > + cfghdr =3D self.get_item_by_index(cfgs['CfgHeader'][= 'indx']) >=20 > + tag_val =3D array_str_to_value(cfghdr['value']) >> 2= 0 >=20 > + tag_dict[name] =3D tag_val >=20 > + if level =3D=3D 1: >=20 > + tag_curr[0] =3D tag_val >=20 > + struct_dict[name] =3D (level, tag_curr[0], cfgs) >=20 > + if path =3D=3D 'FSP_SIG': >=20 > + self.write_fsp_sig_header_file(hdr_file_name) >=20 > + return >=20 > + tag_curr =3D [0] >=20 > + tag_dict =3D {} >=20 > + struct_dict =3D {} >=20 > + >=20 > + if path =3D=3D '': >=20 > + top =3D None >=20 > + else: >=20 > + top =3D self.locate_cfg_item(path) >=20 > + if not top: >=20 > + raise Exception("Invalid configuration path '%s' !" % pa= th) >=20 > + _build_header_struct(path, top, 0) >=20 > + self.traverse_cfg_tree(_build_header_struct, top) >=20 > + >=20 > + if tag_curr[0] =3D=3D 0: >=20 > + hdr_mode =3D 2 >=20 > + else: >=20 > + hdr_mode =3D 1 >=20 > + >=20 > + if re.match('FSP[TMS]_UPD', path): >=20 > + hdr_mode |=3D 0x80 >=20 > + >=20 > + # filter out the items to be built for tags and structures >=20 > + struct_list =3D [] >=20 > + for each in struct_dict: >=20 > + match =3D False >=20 > + for check in CGenYamlCfg.exclude_struct: >=20 > + if re.match(check, each): >=20 > + match =3D True >=20 > + if each in tag_dict: >=20 > + if each not in CGenYamlCfg.include_tag: >=20 > + del tag_dict[each] >=20 > + break >=20 > + if not match: >=20 > + struct_list.append({'name': each, 'alias': '', 'count': = 0, >=20 > + 'level': struct_dict[each][0], >=20 > + 'tag': struct_dict[each][1], >=20 > + 'node': struct_dict[each][2]}) >=20 > + >=20 > + # sort by level so that the bottom level struct >=20 > + # will be build first to satisfy dependencies >=20 > + struct_list =3D sorted(struct_list, key=3Dlambda x: x['level'], >=20 > + reverse=3DTrue) >=20 > + >=20 > + # Convert XXX_[0-9]+ to XXX as an array hint >=20 > + for each in struct_list: >=20 > + cfgs =3D each['node'] >=20 > + if 'struct' in cfgs['$STRUCT']: >=20 > + each['alias'], array_num, var =3D self.get_struct_array_= info( >=20 > + cfgs['$STRUCT']['struct']) >=20 > + else: >=20 > + match =3D re.match('(\\w+)(_\\d+)', each['name']) >=20 > + if match: >=20 > + each['alias'] =3D match.group(1) >=20 > + else: >=20 > + each['alias'] =3D each['name'] >=20 > + >=20 > + # count items for array build >=20 > + for idx, each in enumerate(struct_list): >=20 > + if idx > 0: >=20 > + last_struct =3D struct_list[idx-1]['node']['$STRUCT'] >=20 > + curr_struct =3D each['node']['$STRUCT'] >=20 > + if struct_list[idx-1]['alias'] =3D=3D each['alias'] and = \ >=20 > + curr_struct['length'] =3D=3D last_struct['length'] an= d \ >=20 > + curr_struct['offset'] =3D=3D last_struct['offset'] + = \ >=20 > + last_struct['length']: >=20 > + for idx2 in range(idx-1, 
-1, -1): >=20 > + if struct_list[idx2]['count'] > 0: >=20 > + struct_list[idx2]['count'] +=3D 1 >=20 > + break >=20 > + continue >=20 > + each['count'] =3D 1 >=20 > + >=20 > + # generate common header >=20 > + if com_hdr_file_name: >=20 > + self.write_cfg_header_file(com_hdr_file_name, 0, tag_dict, >=20 > + struct_list) >=20 > + >=20 > + # generate platform header >=20 > + self.write_cfg_header_file(hdr_file_name, hdr_mode, tag_dict, >=20 > + struct_list) >=20 > + >=20 > + return 0 >=20 > + >=20 > + def load_yaml(self, cfg_file): >=20 > + cfg_yaml =3D CFG_YAML() >=20 > + self.initialize() >=20 > + self._cfg_tree =3D cfg_yaml.load_yaml(cfg_file) >=20 > + self._def_dict =3D cfg_yaml.def_dict >=20 > + self._yaml_path =3D os.path.dirname(cfg_file) >=20 > + self.build_cfg_list() >=20 > + self.build_var_dict() >=20 > + self.update_def_value() >=20 > + return 0 >=20 > + >=20 > + >=20 > +def usage(): >=20 > + print('\n'.join([ >=20 > + "GenYamlCfg Version 0.50", >=20 > + "Usage:", >=20 > + " GenYamlCfg GENINC BinFile IncOutFile " >=20 > + " [-D Macros]", >=20 > + >=20 > + " GenYamlCfg GENPKL YamlFile PklOutFile " >=20 > + " [-D Macros]", >=20 > + " GenYamlCfg GENBIN YamlFile[;DltFile] BinOutFile " >=20 > + " [-D Macros]", >=20 > + " GenYamlCfg GENDLT YamlFile[;BinFile] DltOutFile " >=20 > + " [-D Macros]", >=20 > + " GenYamlCfg GENYML YamlFile YamlOutFile" >=20 > + " [-D Macros]", >=20 > + " GenYamlCfg GENHDR YamlFile HdrOutFile " >=20 > + " [-D Macros]" >=20 > + ])) >=20 > + >=20 > + >=20 > +def main(): >=20 > + # Parse the options and args >=20 > + argc =3D len(sys.argv) >=20 > + if argc < 4: >=20 > + usage() >=20 > + return 1 >=20 > + >=20 > + gen_cfg_data =3D CGenYamlCfg() >=20 > + command =3D sys.argv[1].upper() >=20 > + out_file =3D sys.argv[3] >=20 > + if argc >=3D 5 and gen_cfg_data.parse_macros(sys.argv[4:]) !=3D 0: >=20 > + raise Exception("ERROR: Macro parsing failed !") >=20 > + >=20 > + file_list =3D sys.argv[2].split(';') >=20 > + if len(file_list) >=3D 2: >=20 > + yml_file =3D file_list[0] >=20 > + dlt_file =3D file_list[1] >=20 > + elif len(file_list) =3D=3D 1: >=20 > + yml_file =3D file_list[0] >=20 > + dlt_file =3D '' >=20 > + else: >=20 > + raise Exception("ERROR: Invalid parameter '%s' !" 
% sys.argv[2]) >=20 > + yml_scope =3D '' >=20 > + if '@' in yml_file: >=20 > + parts =3D yml_file.split('@') >=20 > + yml_file =3D parts[0] >=20 > + yml_scope =3D parts[1] >=20 > + >=20 > + if command =3D=3D "GENDLT" and yml_file.endswith('.dlt'): >=20 > + # It needs to expand an existing DLT file >=20 > + dlt_file =3D yml_file >=20 > + lines =3D gen_cfg_data.expand_include_files(dlt_file) >=20 > + write_lines(lines, out_file) >=20 > + return 0 >=20 > + >=20 > + if command =3D=3D "GENYML": >=20 > + if not yml_file.lower().endswith('.yaml'): >=20 > + raise Exception('Only YAML file is supported !') >=20 > + gen_cfg_data.generate_yml_file(yml_file, out_file) >=20 > + return 0 >=20 > + >=20 > + bin_file =3D '' >=20 > + if (yml_file.lower().endswith('.bin')) and (command =3D=3D "GENINC")= : >=20 > + # It is binary file >=20 > + bin_file =3D yml_file >=20 > + yml_file =3D '' >=20 > + >=20 > + if bin_file: >=20 > + gen_cfg_data.generate_data_inc_file(out_file, bin_file) >=20 > + return 0 >=20 > + >=20 > + cfg_bin_file =3D '' >=20 > + cfg_bin_file2 =3D '' >=20 > + if dlt_file: >=20 > + if command =3D=3D "GENDLT": >=20 > + cfg_bin_file =3D dlt_file >=20 > + dlt_file =3D '' >=20 > + if len(file_list) >=3D 3: >=20 > + cfg_bin_file2 =3D file_list[2] >=20 > + >=20 > + if yml_file.lower().endswith('.pkl'): >=20 > + with open(yml_file, "rb") as pkl_file: >=20 > + gen_cfg_data.__dict__ =3D marshal.load(pkl_file) >=20 > + gen_cfg_data.prepare_marshal(False) >=20 > + >=20 > + # Override macro definition again for Pickle file >=20 > + if argc >=3D 5: >=20 > + gen_cfg_data.parse_macros(sys.argv[4:]) >=20 > + else: >=20 > + gen_cfg_data.load_yaml(yml_file) >=20 > + if command =3D=3D 'GENPKL': >=20 > + gen_cfg_data.prepare_marshal(True) >=20 > + with open(out_file, "wb") as pkl_file: >=20 > + marshal.dump(gen_cfg_data.__dict__, pkl_file) >=20 > + json_file =3D os.path.splitext(out_file)[0] + '.json' >=20 > + fo =3D open(json_file, 'w') >=20 > + path_list =3D [] >=20 > + cfgs =3D {'_cfg_page': gen_cfg_data._cfg_page, >=20 > + '_cfg_list': gen_cfg_data._cfg_list, >=20 > + '_path_list': path_list} >=20 > + # optimize to reduce size >=20 > + path =3D None >=20 > + for each in cfgs['_cfg_list']: >=20 > + new_path =3D each['path'][:-len(each['cname'])-1] >=20 > + if path !=3D new_path: >=20 > + path =3D new_path >=20 > + each['path'] =3D path >=20 > + path_list.append(path) >=20 > + else: >=20 > + del each['path'] >=20 > + if each['order'] =3D=3D each['offset']: >=20 > + del each['order'] >=20 > + del each['offset'] >=20 > + >=20 > + # value is just used to indicate display type >=20 > + value =3D each['value'] >=20 > + if value.startswith('0x'): >=20 > + hex_len =3D ((each['length'] + 7) // 8) * 2 >=20 > + if len(value) =3D=3D hex_len: >=20 > + value =3D 'x%d' % hex_len >=20 > + else: >=20 > + value =3D 'x' >=20 > + each['value'] =3D value >=20 > + elif value and value[0] in ['"', "'", '{']: >=20 > + each['value'] =3D value[0] >=20 > + else: >=20 > + del each['value'] >=20 > + >=20 > + fo.write(repr(cfgs)) >=20 > + fo.close() >=20 > + return 0 >=20 > + >=20 > + if dlt_file: >=20 > + gen_cfg_data.override_default_value(dlt_file) >=20 > + >=20 > + gen_cfg_data.detect_fsp() >=20 > + >=20 > + if command =3D=3D "GENBIN": >=20 > + if len(file_list) =3D=3D 3: >=20 > + old_data =3D gen_cfg_data.generate_binary_array() >=20 > + fi =3D open(file_list[2], 'rb') >=20 > + new_data =3D bytearray(fi.read()) >=20 > + fi.close() >=20 > + if len(new_data) !=3D len(old_data): >=20 > + raise Exception("Binary file '%s' length does not 
match,= \ >=20 > +ignored !" % file_list[2]) >=20 > + else: >=20 > + gen_cfg_data.load_default_from_bin(new_data) >=20 > + gen_cfg_data.override_default_value(dlt_file) >=20 > + >=20 > + gen_cfg_data.generate_binary(out_file, yml_scope) >=20 > + >=20 > + elif command =3D=3D "GENDLT": >=20 > + full =3D True if 'FULL' in gen_cfg_data._macro_dict else False >=20 > + gen_cfg_data.generate_delta_file(out_file, cfg_bin_file, >=20 > + cfg_bin_file2, full) >=20 > + >=20 > + elif command =3D=3D "GENHDR": >=20 > + out_files =3D out_file.split(';') >=20 > + brd_out_file =3D out_files[0].strip() >=20 > + if len(out_files) > 1: >=20 > + com_out_file =3D out_files[1].strip() >=20 > + else: >=20 > + com_out_file =3D '' >=20 > + gen_cfg_data.create_header_file(brd_out_file, com_out_file, yml_= scope) >=20 > + >=20 > + elif command =3D=3D "GENINC": >=20 > + gen_cfg_data.generate_data_inc_file(out_file) >=20 > + >=20 > + elif command =3D=3D "DEBUG": >=20 > + gen_cfg_data.print_cfgs() >=20 > + >=20 > + else: >=20 > + raise Exception("Unsuported command '%s' !" % command) >=20 > + >=20 > + return 0 >=20 > + >=20 > + >=20 > +if __name__ =3D=3D '__main__': >=20 > + sys.exit(main()) >=20 > diff --git a/IntelFsp2Pkg/Tools/ConfigEditor/SingleSign.py > b/IntelFsp2Pkg/Tools/ConfigEditor/SingleSign.py > new file mode 100644 > index 0000000000..868b29d528 > --- /dev/null > +++ b/IntelFsp2Pkg/Tools/ConfigEditor/SingleSign.py > @@ -0,0 +1,324 @@ > +#!/usr/bin/env python >=20 > +# @ SingleSign.py >=20 > +# Single signing script >=20 > +# >=20 > +# Copyright (c) 2020, Intel Corporation. All rights reserved.
>=20 > +# SPDX-License-Identifier: BSD-2-Clause-Patent >=20 > +# >=20 > +## >=20 > + >=20 > +import os >=20 > +import sys >=20 > +import re >=20 > +import shutil >=20 > +import subprocess >=20 > + >=20 > +SIGNING_KEY =3D { >=20 > + # Key Id | Key File Name start | >=20 > + # > =3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D= =3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D= =3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D >=20 > + # KEY_ID_MASTER is used for signing Slimboot Key Hash Manifest \ >=20 > + # container (KEYH Component) >=20 > + "KEY_ID_MASTER_RSA2048": "MasterTestKey_Priv_RSA2048.pem", >=20 > + "KEY_ID_MASTER_RSA3072": "MasterTestKey_Priv_RSA3072.pem", >=20 > + >=20 > + # KEY_ID_CFGDATA is used for signing external Config data blob) >=20 > + "KEY_ID_CFGDATA_RSA2048": "ConfigTestKey_Priv_RSA2048.pem", >=20 > + "KEY_ID_CFGDATA_RSA3072": "ConfigTestKey_Priv_RSA3072.pem", >=20 > + >=20 > + # KEY_ID_FIRMWAREUPDATE is used for signing capsule firmware update > image) >=20 > + "KEY_ID_FIRMWAREUPDATE_RSA2048": > "FirmwareUpdateTestKey_Priv_RSA2048.pem", >=20 > + "KEY_ID_FIRMWAREUPDATE_RSA3072": > "FirmwareUpdateTestKey_Priv_RSA3072.pem", >=20 > + >=20 > + # KEY_ID_CONTAINER is used for signing container header with mono > signature >=20 > + "KEY_ID_CONTAINER_RSA2048": "ContainerTestKey_Priv_RSA2048.pem", >=20 > + "KEY_ID_CONTAINER_RSA3072": "ContainerTestKey_Priv_RSA3072.pem", >=20 > + >=20 > + # CONTAINER_COMP1_KEY_ID is used for signing container components >=20 > + "KEY_ID_CONTAINER_COMP_RSA2048": > "ContainerCompTestKey_Priv_RSA2048.pem", >=20 > + "KEY_ID_CONTAINER_COMP_RSA3072": > "ContainerCompTestKey_Priv_RSA3072.pem", >=20 > + >=20 > + # KEY_ID_OS1_PUBLIC, KEY_ID_OS2_PUBLIC is used for referencing \ >=20 > + # Boot OS public keys >=20 > + "KEY_ID_OS1_PUBLIC_RSA2048": "OS1_TestKey_Pub_RSA2048.pem", >=20 > + "KEY_ID_OS1_PUBLIC_RSA3072": "OS1_TestKey_Pub_RSA3072.pem", >=20 > + >=20 > + "KEY_ID_OS2_PUBLIC_RSA2048": "OS2_TestKey_Pub_RSA2048.pem", >=20 > + "KEY_ID_OS2_PUBLIC_RSA3072": "OS2_TestKey_Pub_RSA3072.pem", >=20 > + >=20 > + } >=20 > + >=20 > +MESSAGE_SBL_KEY_DIR =3D """!!! PRE-REQUISITE: Path to SBL_KEY_DIR has. >=20 > +to be set with SBL KEYS DIRECTORY !!! \n!!! Generate keys. >=20 > +using GenerateKeys.py available in BootloaderCorePkg/Tools. >=20 > +directory !!! \n !!! Run $python. >=20 > +BootloaderCorePkg/Tools/GenerateKeys.py -k $PATH_TO_SBL_KEY_DIR !!!\n >=20 > +!!! Set SBL_KEY_DIR environ with path to SBL KEYS DIR !!!\n" >=20 > +!!! Windows $set SBL_KEY_DIR=3D$PATH_TO_SBL_KEY_DIR !!!\n >=20 > +!!! 
Linux $export SBL_KEY_DIR=3D$PATH_TO_SBL_KEY_DIR !!!\n""" >=20 > + >=20 > + >=20 > +def get_openssl_path(): >=20 > + if os.name =3D=3D 'nt': >=20 > + if 'OPENSSL_PATH' not in os.environ: >=20 > + openssl_dir =3D "C:\\Openssl\\bin\\" >=20 > + if os.path.exists(openssl_dir): >=20 > + os.environ['OPENSSL_PATH'] =3D openssl_dir >=20 > + else: >=20 > + os.environ['OPENSSL_PATH'] =3D "C:\\Openssl\\" >=20 > + if 'OPENSSL_CONF' not in os.environ: >=20 > + openssl_cfg =3D "C:\\Openssl\\openssl.cfg" >=20 > + if os.path.exists(openssl_cfg): >=20 > + os.environ['OPENSSL_CONF'] =3D openssl_cfg >=20 > + openssl =3D os.path.join( >=20 > + os.environ.get('OPENSSL_PATH', ''), >=20 > + 'openssl.exe') >=20 > + else: >=20 > + # Get openssl path for Linux cases >=20 > + openssl =3D shutil.which('openssl') >=20 > + >=20 > + return openssl >=20 > + >=20 > + >=20 > +def run_process(arg_list, print_cmd=3DFalse, capture_out=3DFalse): >=20 > + sys.stdout.flush() >=20 > + if print_cmd: >=20 > + print(' '.join(arg_list)) >=20 > + >=20 > + exc =3D None >=20 > + result =3D 0 >=20 > + output =3D '' >=20 > + try: >=20 > + if capture_out: >=20 > + output =3D subprocess.check_output(arg_list).decode() >=20 > + else: >=20 > + result =3D subprocess.call(arg_list) >=20 > + except Exception as ex: >=20 > + result =3D 1 >=20 > + exc =3D ex >=20 > + >=20 > + if result: >=20 > + if not print_cmd: >=20 > + print('Error in running process:\n %s' % ' '.join(arg_list)= ) >=20 > + if exc is None: >=20 > + sys.exit(1) >=20 > + else: >=20 > + raise exc >=20 > + >=20 > + return output >=20 > + >=20 > + >=20 > +def check_file_pem_format(priv_key): >=20 > + # Check for file .pem format >=20 > + key_name =3D os.path.basename(priv_key) >=20 > + if os.path.splitext(key_name)[1] =3D=3D ".pem": >=20 > + return True >=20 > + else: >=20 > + return False >=20 > + >=20 > + >=20 > +def get_key_id(priv_key): >=20 > + # Extract base name if path is provided. >=20 > + key_name =3D os.path.basename(priv_key) >=20 > + # Check for KEY_ID in key naming. >=20 > + if key_name.startswith('KEY_ID'): >=20 > + return key_name >=20 > + else: >=20 > + return None >=20 > + >=20 > + >=20 > +def get_sbl_key_dir(): >=20 > + # Check Key store setting SBL_KEY_DIR path >=20 > + if 'SBL_KEY_DIR' not in os.environ: >=20 > + exception_string =3D "ERROR: SBL_KEY_DIR is not defined." \ >=20 > + " Set SBL_KEY_DIR with SBL Keys directory!!\n" >=20 > + raise Exception(exception_string + MESSAGE_SBL_KEY_DIR) >=20 > + >=20 > + sbl_key_dir =3D os.environ.get('SBL_KEY_DIR') >=20 > + if not os.path.exists(sbl_key_dir): >=20 > + exception_string =3D "ERROR:SBL_KEY_DIR set " + sbl_key_dir \ >=20 > + + " is not valid." \ >=20 > + " Set the correct SBL_KEY_DIR path !!\n" \ >=20 > + + MESSAGE_SBL_KEY_DIR >=20 > + raise Exception(exception_string) >=20 > + else: >=20 > + return sbl_key_dir >=20 > + >=20 > + >=20 > +def get_key_from_store(in_key): >=20 > + >=20 > + # Check in_key is path to key >=20 > + if os.path.exists(in_key): >=20 > + return in_key >=20 > + >=20 > + # Get Slimboot key dir path >=20 > + sbl_key_dir =3D get_sbl_key_dir() >=20 > + >=20 > + # Extract if in_key is key_id >=20 > + priv_key =3D get_key_id(in_key) >=20 > + if priv_key is not None: >=20 > + if (priv_key in SIGNING_KEY): >=20 > + # Generate key file name from key id >=20 > + priv_key_file =3D SIGNING_KEY[priv_key] >=20 > + else: >=20 > + exception_string =3D "KEY_ID" + priv_key + "is not found " \ >=20 > + "is not found in supported KEY IDs!!" 
>=20 > + raise Exception(exception_string) >=20 > + elif check_file_pem_format(in_key): >=20 > + # check if file name is provided in pem format >=20 > + priv_key_file =3D in_key >=20 > + else: >=20 > + priv_key_file =3D None >=20 > + raise Exception('key provided %s is not valid!' % in_key) >=20 > + >=20 > + # Create a file path >=20 > + # Join Key Dir and priv_key_file >=20 > + try: >=20 > + priv_key =3D os.path.join(sbl_key_dir, priv_key_file) >=20 > + except Exception: >=20 > + raise Exception('priv_key is not found %s!' % priv_key) >=20 > + >=20 > + # Check for priv_key construted based on KEY ID exists in specified = path >=20 > + if not os.path.isfile(priv_key): >=20 > + exception_string =3D "!!! ERROR: Key file corresponding to" \ >=20 > + + in_key + "do not exist in Sbl key " \ >=20 > + "directory at" + sbl_key_dir + "!!! \n" \ >=20 > + + MESSAGE_SBL_KEY_DIR >=20 > + raise Exception(exception_string) >=20 > + >=20 > + return priv_key >=20 > + >=20 > +# >=20 > +# Sign an file using openssl >=20 > +# >=20 > +# priv_key [Input] Key Id or Path to Private key >=20 > +# hash_type [Input] Signing hash >=20 > +# sign_scheme[Input] Sign/padding scheme >=20 > +# in_file [Input] Input file to be signed >=20 > +# out_file [Input/Output] Signed data file >=20 > +# >=20 > + >=20 > + >=20 > +def single_sign_file(priv_key, hash_type, sign_scheme, in_file, out_file= ): >=20 > + >=20 > + _hash_type_string =3D { >=20 > + "SHA2_256": 'sha256', >=20 > + "SHA2_384": 'sha384', >=20 > + "SHA2_512": 'sha512', >=20 > + } >=20 > + >=20 > + _hash_digest_Size =3D { >=20 > + # Hash_string : Hash_Size >=20 > + "SHA2_256": 32, >=20 > + "SHA2_384": 48, >=20 > + "SHA2_512": 64, >=20 > + "SM3_256": 32, >=20 > + } >=20 > + >=20 > + _sign_scheme_string =3D { >=20 > + "RSA_PKCS1": 'pkcs1', >=20 > + "RSA_PSS": 'pss', >=20 > + } >=20 > + >=20 > + priv_key =3D get_key_from_store(priv_key) >=20 > + >=20 > + # Temporary files to store hash generated >=20 > + hash_file_tmp =3D out_file+'.hash.tmp' >=20 > + hash_file =3D out_file+'.hash' >=20 > + >=20 > + # Generate hash using openssl dgst in hex format >=20 > + cmdargs =3D [get_openssl_path(), >=20 > + 'dgst', >=20 > + '-'+'%s' % _hash_type_string[hash_type], >=20 > + '-out', '%s' % hash_file_tmp, '%s' % in_file] >=20 > + run_process(cmdargs) >=20 > + >=20 > + # Extract hash form dgst command output and convert to ascii >=20 > + with open(hash_file_tmp, 'r') as fin: >=20 > + hashdata =3D fin.read() >=20 > + fin.close() >=20 > + >=20 > + try: >=20 > + hashdata =3D hashdata.rsplit('=3D', 1)[1].strip() >=20 > + except Exception: >=20 > + raise Exception('Hash Data not found for signing!') >=20 > + >=20 > + if len(hashdata) !=3D (_hash_digest_Size[hash_type] * 2): >=20 > + raise Exception('Hash Data size do match with for hash type!') >=20 > + >=20 > + hashdata_bytes =3D bytearray.fromhex(hashdata) >=20 > + open(hash_file, 'wb').write(hashdata_bytes) >=20 > + >=20 > + print("Key used for Singing %s !!" 
% priv_key) >=20 > + >=20 > + # sign using Openssl pkeyutl >=20 > + cmdargs =3D [get_openssl_path(), >=20 > + 'pkeyutl', '-sign', '-in', '%s' % hash_file, >=20 > + '-inkey', '%s' % priv_key, '-out', >=20 > + '%s' % out_file, '-pkeyopt', >=20 > + 'digest:%s' % _hash_type_string[hash_type], >=20 > + '-pkeyopt', 'rsa_padding_mode:%s' % >=20 > + _sign_scheme_string[sign_scheme]] >=20 > + >=20 > + run_process(cmdargs) >=20 > + >=20 > + return >=20 > + >=20 > +# >=20 > +# Extract public key using openssl >=20 > +# >=20 > +# in_key [Input] Private key or public key in pem format >=20 > +# pub_key_file [Input/Output] Public Key to a file >=20 > +# >=20 > +# return keydata (mod, exp) in bin format >=20 > +# >=20 > + >=20 > + >=20 > +def single_sign_gen_pub_key(in_key, pub_key_file=3DNone): >=20 > + >=20 > + in_key =3D get_key_from_store(in_key) >=20 > + >=20 > + # Expect key to be in PEM format >=20 > + is_prv_key =3D False >=20 > + cmdline =3D [get_openssl_path(), 'rsa', '-pubout', '-text', '-noout'= , >=20 > + '-in', '%s' % in_key] >=20 > + # Check if it is public key or private key >=20 > + text =3D open(in_key, 'r').read() >=20 > + if '-BEGIN RSA PRIVATE KEY-' in text: >=20 > + is_prv_key =3D True >=20 > + elif '-BEGIN PUBLIC KEY-' in text: >=20 > + cmdline.extend(['-pubin']) >=20 > + else: >=20 > + raise Exception('Unknown key format "%s" !' % in_key) >=20 > + >=20 > + if pub_key_file: >=20 > + cmdline.extend(['-out', '%s' % pub_key_file]) >=20 > + capture =3D False >=20 > + else: >=20 > + capture =3D True >=20 > + >=20 > + output =3D run_process(cmdline, capture_out=3Dcapture) >=20 > + if not capture: >=20 > + output =3D text =3D open(pub_key_file, 'r').read() >=20 > + data =3D output.replace('\r', '') >=20 > + data =3D data.replace('\n', '') >=20 > + data =3D data.replace(' ', '') >=20 > + >=20 > + # Extract the modulus >=20 > + if is_prv_key: >=20 > + match =3D re.search('modulus(.*)publicExponent:\\s+(\\d+)\\s+', = data) >=20 > + else: >=20 > + match =3D re.search('Modulus(?:.*?):(.*)Exponent:\\s+(\\d+)\\s+'= , data) >=20 > + if not match: >=20 > + raise Exception('Public key not found!') >=20 > + modulus =3D match.group(1).replace(':', '') >=20 > + exponent =3D int(match.group(2)) >=20 > + >=20 > + mod =3D bytearray.fromhex(modulus) >=20 > + # Remove the '00' from the front if the MSB is 1 >=20 > + if mod[0] =3D=3D 0 and (mod[1] & 0x80): >=20 > + mod =3D mod[1:] >=20 > + exp =3D bytearray.fromhex('{:08x}'.format(exponent)) >=20 > + >=20 > + keydata =3D mod + exp >=20 > + >=20 > + return keydata >=20 > diff --git a/IntelFsp2Pkg/Tools/UserManuals/ConfigEditorUserManual.md > b/IntelFsp2Pkg/Tools/UserManuals/ConfigEditorUserManual.md > new file mode 100644 > index 0000000000..d196426608 > --- /dev/null > +++ b/IntelFsp2Pkg/Tools/UserManuals/ConfigEditorUserManual.md > @@ -0,0 +1,46 @@ > +#Name >=20 > +**ConfigEditor.py** is a python script with a GUI interface that can sup= port > changing configuration settings directly from the interface without havin= g to > modify the source. >=20 > + >=20 > +#Description >=20 > +This is a GUI interface that can be used by users who would like to chan= ge > configuration settings directly from the interface without having to modi= fy the > SBL source. >=20 > +This tool depends on Python GUI tool kit Tkinter. It runs on both Window= s and > Linux. >=20 > +The user needs to load the YAML file along with DLT file for a specific = board > into the ConfigEditor, change the desired configuration values. 
> Finally, generate a new configuration delta file or a config binary blob for the newly changed
> values to take effect. These will be the inputs to the merge tool or the stitch tool
> so that new config changes can be merged and stitched into the final configuration blob.
> +
> +
> +It supports the following options:
> +
> +## 1. Open Config YAML file
> +This option loads the YAML file for an FSP UPD into the ConfigEditor to change the desired configuration values.
> +
> +##### Example:
> +```
> +![Example ConfigEditor 1](https://slimbootloader.github.io/_images/CfgEditOpen.png)
> +
> +![Example ConfigEditor 2](https://slimbootloader.github.io/_images/CfgEditDefYaml.png)
> +```
> +## 2. Open Config BSF File
> +This option loads the BSF file into the ConfigEditor to change the desired configuration values.
> +A BSF file can be loaded directly without loading any YAML file. This is an alternative route for
> +backward compatibility (projects without YAML capability).
> +
> +## 3. Show Binary Configuration
> +This option loads configuration data from an FD file and displays it in the ConfigEditor.
> +
> +## 4. Save Config Data to Binary
> +This option generates a config binary blob for the newly changed values to take effect.
> +
> +## 5. Load Config Data from Binary
> +This option reloads the changed configuration from a BIN file into the ConfigEditor.
> +
> +## 6. Load Config Changes from Delta File
> +This option loads the changed configuration values from a Delta file into the ConfigEditor.
> +
> +## 7. Save Config Changes to Delta File
> +This option generates a new configuration delta file for the newly changed values to take effect.
> +
> +## 8. Save Full Config Data to Delta File
> +This option saves all the changed configuration values into a Delta file.
> +
> +## Running Configuration Editor:
> +
> + **python ConfigEditor.py**
> --
> 2.28.0.windows.1
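
A note for anyone picking this up: the attribute set handled by add_cfg_item() / set_cur_page() in the quoted parser is easier to see from a sample item than from the code. Below is a rough, hypothetical sketch of a single UPD option using only those attributes; the item name, page id and values are invented, and the surrounding YAML file structure (templates, section headers, includes) is omitted:

```
  DebugPrintLevel :
    page     : MEM::"Memory Settings"   # defines page 'MEM' under 'root' and selects it
    name     : Debug Print Level
    type     : Combo
    option   : 0:Disable, 1:Enable      # parsed by get_cfg_item_options() as value:text pairs
    length   : 0x01                     # bytes; bit/word forms such as '1b' or '4W' are also accepted
    value    : 0x01
    help     : Select the debug message output level
```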
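
Similarly, the delta (.dlt) files consumed by override_default_value() are plain "path | value" lines ('#' starts a comment), and that routine insists on a PLATFORMID_CFG_DATA.PlatformId entry. A minimal made-up example, plus matching command lines taken from usage() - assuming the script is invoked as GenYamlCfg.py; the YAML file name and the second item path are hypothetical, and the ';' argument needs quoting in a shell:

```
# Board.dlt - hypothetical delta file
PLATFORMID_CFG_DATA.PlatformId           | 0x0001
GEN_CFG_DATA.DebugPrintLevel             | 0x1

# Build a config binary from the YAML defaults plus the delta overrides
python GenYamlCfg.py GENBIN "Board.yaml;Board.dlt" CfgDataBin.bin

# Regenerate a delta file by diffing a config binary against the YAML defaults
python GenYamlCfg.py GENDLT "Board.yaml;CfgDataBin.bin" CfgDataOut.dlt
```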
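
For SingleSign.py, here is a minimal sketch of driving the helpers from Python rather than through ConfigEditor. It assumes SBL_KEY_DIR points at a key directory created with BootloaderCorePkg/Tools/GenerateKeys.py, that the ConfigEditor tool directory is importable, and that openssl is on the PATH; the file names and key-store path are placeholders:

```
import os
from SingleSign import single_sign_file, single_sign_gen_pub_key

# Key store produced by GenerateKeys.py (placeholder path).
os.environ['SBL_KEY_DIR'] = '/path/to/sbl/keys'

# Hash CfgDataBlob.bin with SHA2_384, then sign the digest with RSA-PSS
# using the config-data test key id defined in SIGNING_KEY.
single_sign_file('KEY_ID_CFGDATA_RSA3072', 'SHA2_384', 'RSA_PSS',
                 'CfgDataBlob.bin', 'CfgDataBlob.sig')

# Extract the matching public key as raw modulus||exponent bytes.
key_blob = single_sign_gen_pub_key('KEY_ID_CFGDATA_RSA3072')
print('public key blob: %d bytes' % len(key_blob))
```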