                    GNU GENERAL PUBLIC LICENSE
                       Version 3, 29 June 2007

 Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.

                            Preamble

  The GNU General Public License is a free, copyleft license for
software and other kinds of works.

  The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.

  When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

  To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.

  For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.

  Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.

  For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.

  Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.

  Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.

  The precise terms and conditions for copying, distribution and
modification follow.

                       TERMS AND CONDITIONS

  0. Definitions.

  "This License" refers to version 3 of the GNU General Public License.

  "Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.

  "The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.

  To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.

  A "covered work" means either the unmodified Program or a work based
on the Program.

  To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

  To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.

  An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.

  1. Source Code.

  The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.

  A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

  The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

  The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

  The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.

  The Corresponding Source for a work in source code form is that
same work.

  2. Basic Permissions.

  All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

  You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.

  Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.

  3. Protecting Users' Legal Rights From Anti-Circumvention Law.

  No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

  When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.

  4. Conveying Verbatim Copies.

  You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

  You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

  5. Conveying Modified Source Versions.

  You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

    a) The work must carry prominent notices stating that you modified
    it, and giving a relevant date.

    b) The work must carry prominent notices stating that it is
    released under this License and any conditions added under section
    7. This requirement modifies the requirement in section 4 to
    "keep intact all notices".

    c) You must license the entire work, as a whole, under this
    License to anyone who comes into possession of a copy. This
    License will therefore apply, along with any applicable section 7
    additional terms, to the whole of the work, and all its parts,
    regardless of how they are packaged. This License gives no
    permission to license the work in any other way, but it does not
    invalidate such permission if you have separately received it.

    d) If the work has interactive user interfaces, each must display
    Appropriate Legal Notices; however, if the Program has interactive
    interfaces that do not display Appropriate Legal Notices, your
    work need not make them do so.

  A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.

  6. Conveying Non-Source Forms.

  You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

    a) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by the
    Corresponding Source fixed on a durable physical medium
    customarily used for software interchange.

    b) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by a
    written offer, valid for at least three years and valid for as
    long as you offer spare parts or customer support for that product
    model, to give anyone who possesses the object code either (1) a
    copy of the Corresponding Source for all the software in the
    product that is covered by this License, on a durable physical
    medium customarily used for software interchange, for a price no
    more than your reasonable cost of physically performing this
    conveying of source, or (2) access to copy the
    Corresponding Source from a network server at no charge.

    c) Convey individual copies of the object code with a copy of the
    written offer to provide the Corresponding Source. This
    alternative is allowed only occasionally and noncommercially, and
    only if you received the object code with such an offer, in accord
    with subsection 6b.

    d) Convey the object code by offering access from a designated
    place (gratis or for a charge), and offer equivalent access to the
    Corresponding Source in the same way through the same place at no
    further charge. You need not require recipients to copy the
    Corresponding Source along with the object code. If the place to
    copy the object code is a network server, the Corresponding Source
    may be on a different server (operated by you or a third party)
    that supports equivalent copying facilities, provided you maintain
    clear directions next to the object code saying where to find the
    Corresponding Source. Regardless of what server hosts the
    Corresponding Source, you remain obligated to ensure that it is
    available for as long as needed to satisfy these requirements.

    e) Convey the object code using peer-to-peer transmission, provided
    you inform other peers where the object code and Corresponding
    Source of the work are being offered to the general public at no
    charge under subsection 6d.

  A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

  A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

  "Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

  If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

  The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

  Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.

  7. Additional Terms.

  "Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

  When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

  Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

    a) Disclaiming warranty or limiting liability differently from the
    terms of sections 15 and 16 of this License; or

    b) Requiring preservation of specified reasonable legal notices or
    author attributions in that material or in the Appropriate Legal
    Notices displayed by works containing it; or

    c) Prohibiting misrepresentation of the origin of that material, or
    requiring that modified versions of such material be marked in
    reasonable ways as different from the original version; or

    d) Limiting the use for publicity purposes of names of licensors or
    authors of the material; or

    e) Declining to grant rights under trademark law for use of some
    trade names, trademarks, or service marks; or

    f) Requiring indemnification of licensors and authors of that
    material by anyone who conveys the material (or modified versions of
    it) with contractual assumptions of liability to the recipient, for
    any liability that these contractual assumptions directly impose on
    those licensors and authors.

  All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

  If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

  Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.

  8. Termination.

  You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

  However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.

  Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

  Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

  9. Acceptance Not Required for Having Copies.

  You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.

  10. Automatic Licensing of Downstream Recipients.

  Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.

  An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.

  You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.

  11. Patents.

  A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".

  A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.

  Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

  In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

  If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.

  If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.

  A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.

  Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.

  12. No Surrender of Others' Freedom.

  If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.

  13. Use with the GNU Affero General Public License.

  Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.

  14. Revised Versions of this License.

  The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.

  Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.

  If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.

  Later license versions may give you additional or different
|
||||
permissions. However, no additional obligations are imposed on any
|
||||
author or copyright holder as a result of your choosing to follow a
|
||||
later version.
|
||||
|
||||
15. Disclaimer of Warranty.
|
||||
|
||||
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
|
||||
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
|
||||
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
|
||||
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
|
||||
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
|
||||
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
|
||||
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
|
||||
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
|
||||
|
||||
16. Limitation of Liability.
|
||||
|
||||
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
|
||||
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
|
||||
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
|
||||
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
|
||||
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
|
||||
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
|
||||
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
|
||||
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
|
||||
SUCH DAMAGES.
|
||||
|
||||
17. Interpretation of Sections 15 and 16.
|
||||
|
||||
If the disclaimer of warranty and limitation of liability provided
|
||||
above cannot be given local legal effect according to their terms,
|
||||
reviewing courts shall apply local law that most closely approximates
|
||||
an absolute waiver of all civil liability in connection with the
|
||||
Program, unless a warranty or assumption of liability accompanies a
|
||||
copy of the Program in return for a fee.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
||||
|
||||
How to Apply These Terms to Your New Programs
|
||||
|
||||
If you develop a new program, and you want it to be of the greatest
|
||||
possible use to the public, the best way to achieve this is to make it
|
||||
free software which everyone can redistribute and change under these terms.
|
||||
|
||||
To do so, attach the following notices to the program. It is safest
|
||||
to attach them to the start of each source file to most effectively
|
||||
state the exclusion of warranty; and each file should have at least
|
||||
the "copyright" line and a pointer to where the full notice is found.
|
||||
|
||||
{one line to give the program's name and a brief idea of what it does.}
|
||||
Copyright (C) {year} {name of author}
|
||||
|
||||
This program is free software: you can redistribute it and/or modify
|
||||
it under the terms of the GNU General Public License as published by
|
||||
the Free Software Foundation, either version 3 of the License, or
|
||||
(at your option) any later version.
|
||||
|
||||
This program is distributed in the hope that it will be useful,
|
||||
but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
GNU General Public License for more details.
|
||||
|
||||
You should have received a copy of the GNU General Public License
|
||||
along with this program. If not, see <http://www.gnu.org/licenses/>.
|
||||
|
||||
Also add information on how to contact you by electronic and paper mail.
|
||||
|
||||
If the program does terminal interaction, make it output a short
|
||||
notice like this when it starts in an interactive mode:
|
||||
|
||||
{project} Copyright (C) {year} {fullname}
|
||||
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
|
||||
This is free software, and you are welcome to redistribute it
|
||||
under certain conditions; type `show c' for details.
|
||||
|
||||
The hypothetical commands `show w' and `show c' should show the appropriate
|
||||
parts of the General Public License. Of course, your program's commands
|
||||
might be different; for a GUI interface, you would use an "about box".
|
||||
|
||||
You should also get your employer (if you work as a programmer) or school,
|
||||
if any, to sign a "copyright disclaimer" for the program, if necessary.
|
||||
For more information on this, and how to apply and follow the GNU GPL, see
|
||||
<http://www.gnu.org/licenses/>.
|
||||
|
||||
The GNU General Public License does not permit incorporating your program
|
||||
into proprietary programs. If your program is a subroutine library, you
|
||||
may consider it more useful to permit linking proprietary applications with
|
||||
the library. If this is what you want to do, use the GNU Lesser General
|
||||
Public License instead of this License. But first, please read
|
||||
<http://www.gnu.org/philosophy/why-not-lgpl.html>.
|
||||
@@ -0,0 +1,38 @@
For the full instructions on how to use, go to:
https://help.poliigon.com/en/articles/6342599-poliigon-blender-addon

For the latest changelog and primary download, go to:
https://poliigon.com/blender

For Blender 4.2 and later:

Step 1: Download the zip file to anywhere on your hard drive, do not unzip it.
        If you are reading these instructions, you likely need to re-download
        after turning off "auto-open files" on your browser!
Step 2: Open a new instance of Blender
Step 3: Drag and drop the zip file anywhere into the Blender window to get a popup
Step 4: Make sure "Enable on install" (and "overwrite", if this is a re-install)
        are ticked, then press OK
Step 5: Close user preferences
Step 6: Go to the 3D editor, press "n" or otherwise open the toolshelf,
        click on the "Poliigon" tab

For Blender 4.1 and older, or if you have issues with the drag-install above:

Step 1: Download the zip file to anywhere on your hard drive, do not unzip it.
        If you are reading these instructions, you likely need to re-download
        after turning off "auto-open files" on your browser!
Step 2: Open a new instance of Blender
Step 3: Go to File > User Preferences, then Addons
Step 4: At the bottom, click Install from File
Step 5: Locate the zipped download and double click it to install.
Step 6: Search for "poliigon" if the add-on does not immediately appear, and
        click the tick box to enable the add-on
Step 7: Press "save preferences" from the bottom left menu in the preferences
        screen, to ensure the add-on remains enabled
Step 8: Close user preferences
Step 9: Go to the 3D editor, press "n" or otherwise open the toolshelf,
        click on the "Poliigon" tab

Need further help? You can start a conversation with the team here:
https://help.poliigon.com/en/articles/6342599-poliigon-blender-addon
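Step 1 above warns against unzipping the download. A quick, hedged way to sanity-check that a download is still an installable add-on zip (a sketch, assuming the add-on package ships an `__init__.py` inside the archive; `looks_like_addon_zip` is a hypothetical helper, not part of the add-on):

```python
import zipfile


def looks_like_addon_zip(path: str) -> bool:
    """Heuristic check that a download is an installable add-on zip.

    Blender expects the zip to contain the add-on package itself, so a
    download that a browser auto-unzipped (or that was re-zipped wrongly)
    will typically fail this check.
    """
    if not zipfile.is_zipfile(path):
        # Not a zip at all, e.g. an already-extracted folder or a text file.
        return False
    with zipfile.ZipFile(path) as zf:
        # Look for a Python package entry point anywhere in the archive.
        return any(name.endswith("__init__.py") for name in zf.namelist())
```

If this returns False for your download, re-download the zip before attempting Step 3.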
@@ -0,0 +1,207 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

bl_info = {
    "name": "Poliigon",
    "author": "Poliigon",
    "version": (1, 12, 1),
    "blender": (2, 83, 0),
    "location": "3D View",
    "description": "Load models, textures, and more from Poliigon and locally",
    "doc_url": "https://help.poliigon.com/en/articles/6342599-poliigon-blender-addon-2023?utm_source=blender&utm_medium=addon",  # noqa: E501
    "tracker_url": "https://help.poliigon.com/en/?utm_source=blender&utm_medium=addon",  # noqa: E501
    "category": "3D View",
}


if "bpy" in locals():
    import importlib
    import bpy

    importlib.reload(dlg_account)  # noqa: F821
    importlib.reload(dlg_add_node_groups)  # noqa: F821
    importlib.reload(dlg_area_categories)  # noqa: F821
    importlib.reload(dlg_area_notifications)  # noqa: F821
    importlib.reload(dlg_area_tabs)  # noqa: F821
    importlib.reload(dlg_assets)  # noqa: F821
    importlib.reload(dlg_init_library)  # noqa: F821
    importlib.reload(dlg_login)  # noqa: F821
    importlib.reload(dlg_popup)  # noqa: F821
    importlib.reload(dlg_quickmenu)  # noqa: F821
    importlib.reload(utils_dlg)  # noqa: F821
    importlib.reload(operator_active)  # noqa: F821
    importlib.reload(operator_add_converter_node)  # noqa: F821
    importlib.reload(operator_apply)  # noqa: F821
    importlib.reload(operator_cancel_download)  # noqa: F821
    importlib.reload(operator_category)  # noqa: F821
    importlib.reload(operator_check_update)  # noqa: F821
    importlib.reload(operator_close_notification)  # noqa: F821
    importlib.reload(operator_detail)  # noqa: F821
    importlib.reload(operator_directory)  # noqa: F821
    importlib.reload(operator_download)  # noqa: F821
    importlib.reload(operator_folder)  # noqa: F821
    importlib.reload(operator_hdri)  # noqa: F821
    importlib.reload(operator_library)  # noqa: F821
    importlib.reload(operator_link)  # noqa: F821
    importlib.reload(operator_local_asset_sync)  # noqa: F821
    importlib.reload(operator_load_asset_from_list)  # noqa: F821
    importlib.reload(operator_material)  # noqa: F821
    importlib.reload(operator_model)  # noqa: F821
    importlib.reload(operator_options)  # noqa: F821
    importlib.reload(operator_notice)  # noqa: F821
    importlib.reload(operator_popup_message)  # noqa: F821
    importlib.reload(operator_preview)  # noqa: F821
    importlib.reload(operator_refresh_data)  # noqa: F821
    importlib.reload(operator_report_error)  # noqa: F821
    importlib.reload(operator_select)  # noqa: F821
    importlib.reload(operator_setting)  # noqa: F821
    importlib.reload(operator_show_preferences)  # noqa: F821
    importlib.reload(operator_show_quick_menu)  # noqa: F821
    importlib.reload(operator_unsupported_convention)  # noqa: F821
    importlib.reload(operator_user)  # noqa: F821
    importlib.reload(operator_view_thumbnail)  # noqa: F821
    importlib.reload(register_operators)  # noqa: F821
    importlib.reload(utils_operator)  # noqa: F821
    importlib.reload(constants)  # noqa: F821
    importlib.reload(preferences_map_prefs)  # noqa: F821
    importlib.reload(preferences)  # noqa: F821
    importlib.reload(props)  # noqa: F821
    importlib.reload(reporting)  # noqa: F821
    importlib.reload(toolbox)  # noqa: F821
    importlib.reload(toolbox_settings)  # noqa: F821
    importlib.reload(ui)  # noqa: F821
    if bpy.app.version >= (3, 0):
        importlib.reload(asset_browser_sync_commands)  # noqa: F821
        importlib.reload(asset_browser)  # noqa: F821
        importlib.reload(asset_browser_ui)  # noqa: F821
        importlib.reload(asset_browser_operator_import)  # noqa: F821
        importlib.reload(asset_browser_operator_quick_menu)  # noqa: F821
        importlib.reload(asset_browser_operator_reprocess)  # noqa: F821
        importlib.reload(asset_browser_operator_sync_cancel)  # noqa: F821
        importlib.reload(asset_browser_operator_sync_client)  # noqa: F821
        importlib.reload(asset_browser_operator_update)  # noqa: F821
        importlib.reload(asset_browser_operators)  # noqa: F821
    importlib.reload(api)  # noqa: F821
    importlib.reload(env)  # noqa: F821
    importlib.reload(updater)  # noqa: F821
else:
    import bpy

    from .dialogs import dlg_account  # noqa: F401
    from .dialogs import dlg_add_node_groups  # noqa: F401
    from .dialogs import dlg_area_categories  # noqa: F401
    from .dialogs import dlg_area_notifications  # noqa: F401
    from .dialogs import dlg_area_tabs  # noqa: F401
    from .dialogs import dlg_assets  # noqa: F401
    from .dialogs import dlg_init_library  # noqa: F401
    from .dialogs import dlg_login  # noqa: F401
    from .dialogs import dlg_popup  # noqa: F401
    from .dialogs import dlg_quickmenu  # noqa: F401
    from .dialogs import utils_dlg  # noqa: F401
    from .operators import operator_active  # noqa: F401
    from .operators import operator_add_converter_node  # noqa: F401
    from .operators import operator_apply  # noqa: F401
    from .operators import operator_cancel_download  # noqa: F401
    from .operators import operator_category  # noqa: F401
    from .operators import operator_check_update  # noqa: F401
    from .operators import operator_close_notification  # noqa: F401
    from .operators import operator_detail  # noqa: F401
    from .operators import operator_directory  # noqa: F401
    from .operators import operator_download  # noqa: F401
    from .operators import operator_folder  # noqa: F401
    from .operators import operator_hdri  # noqa: F401
    from .operators import operator_library  # noqa: F401
    from .operators import operator_link  # noqa: F401
    from .operators import operator_local_asset_sync  # noqa: F401
    from .operators import operator_load_asset_from_list  # noqa: F401
    from .operators import operator_material  # noqa: F401
    from .operators import operator_model  # noqa: F401
    from .operators import operator_options  # noqa: F401
    from .operators import operator_notice  # noqa: F401
    from .operators import operator_popup_message  # noqa: F401
    from .operators import operator_preview  # noqa: F401
    from .operators import operator_refresh_data  # noqa: F401
    from .operators import operator_report_error  # noqa: F401
    from .operators import operator_select  # noqa: F401
    from .operators import operator_setting  # noqa: F401
    from .operators import operator_show_preferences  # noqa: F401
    from .operators import operator_show_quick_menu  # noqa: F401
    from .operators import operator_unsupported_convention  # noqa: F401
    from .operators import operator_user  # noqa: F401
    from .operators import operator_view_thumbnail  # noqa: F401
    from .operators import register_operators
    from .operators import utils_operator  # noqa: F401
    from . import constants  # noqa: F401
    from . import preferences_map_prefs
    from . import preferences
    from . import props
    from . import reporting  # noqa: F401
    from . import toolbox
    from . import toolbox_settings  # noqa: F401
    from . import ui
    if bpy.app.version >= (3, 0):
        from .asset_browser import asset_browser_sync_commands  # noqa: F401
        from .asset_browser import asset_browser  # noqa: F401
        from .asset_browser import asset_browser_ui
        from .asset_browser import asset_browser_operator_import  # noqa: F401
        from .asset_browser import asset_browser_operator_quick_menu  # noqa: F401
        from .asset_browser import asset_browser_operator_reprocess  # noqa: F401
        from .asset_browser import asset_browser_operator_sync_cancel  # noqa: F401
        from .asset_browser import asset_browser_operator_sync_client  # noqa: F401
        from .asset_browser import asset_browser_operator_update  # noqa: F401
        from .asset_browser import asset_browser_operators
    from .modules.poliigon_core import api  # noqa: F401, needed for tests
    from .modules.poliigon_core import env  # noqa: F401, needed for tests
    from .modules.poliigon_core import updater  # noqa: F401, needed for tests


def register():
    aver = ".".join([str(x) for x in bl_info["version"]])

    toolbox.init_context(aver)

    preferences_map_prefs.register(aver)  # needed in props
    props.register()
    preferences.register(aver)
    toolbox.register(aver)
    register_operators.register(aver)
    ui.register(aver)
    if bpy.app.version >= (3, 0):
        asset_browser.register(aver)
        asset_browser_operators.register(aver)
        asset_browser_ui.register(aver)


def unregister():
    # Reverse order of register.
    if toolbox.cTB is not None:
        toolbox.shutdown_addon()
    if bpy.app.version >= (3, 0):
        asset_browser_ui.unregister()
        asset_browser_operators.unregister()
        asset_browser.unregister()
    ui.unregister()
    register_operators.unregister()
    toolbox.unregister()
    preferences.unregister()
    props.unregister()
    preferences_map_prefs.unregister()


if __name__ == "__main__":
    register()
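register() above flattens the `bl_info` version tuple into a dotted string before handing it to each submodule. A standalone sketch of that conversion (`version_string` is a hypothetical helper name, not part of the add-on):

```python
def version_string(version: tuple) -> str:
    """Join a bl_info-style version tuple, e.g. (1, 12, 1), into "1.12.1"."""
    # Each component may be an int, so convert before joining.
    return ".".join(str(x) for x in version)
```

This is the same expression as `aver = ".".join([str(x) for x in bl_info["version"]])` in register(), factored into a function.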
File diff suppressed because it is too large
@@ -0,0 +1,100 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

from bpy.types import Operator

from ..modules.poliigon_core.assets import AssetType
from ..modules.poliigon_core.multilingual import _t
from .. import reporting
from ..toolbox import get_context
from . import asset_browser as ab


# https://blender.stackexchange.com/questions/249837/how-do-i-get-the-selected-assets-in-the-asset-browser-using-the-api
class POLIIGON_OT_asset_browser_import(Operator):
    bl_idname = "poliigon.asset_browser_import"
    bl_label = _t("Import Selected Assets")
    bl_space_type = "FILE_BROWSER"

    @staticmethod
    def init_context(addon_version: str) -> None:
        """Called from operators.py to init global addon context."""

        global cTB
        cTB = get_context(addon_version)

    @classmethod
    def poll(cls, context):
        is_poliigon_lib = ab.is_poliigon_library(context)
        assets_selected = ab.get_num_selected_assets(context) > 0
        return is_poliigon_lib and assets_selected

    @classmethod
    def description(cls, context, properties):
        num_selected = ab.get_num_selected_assets(context)
        if num_selected > 0:
            return _t("Import selected assets (default parameters)")
        else:
            return _t("No asset selected.\nPlease select an asset")

    @reporting.handle_operator(silent=True)
    def execute(self, context):
        if not ab.is_poliigon_library(context):
            # As the operator should only be shown for the Poliigon
            # library, we shouldn't be here
            error_msg = ("POLIIGON_OT_asset_browser_import(): "
                         "Poliigon library not selected!")
            reporting.capture_message(
                "asset_browser_lib_not_sel", error_msg, "error")
            return {"CANCELLED"}

        asset_files = ab.get_selected_assets(context)

        for _asset_file in asset_files:
            asset_name = ab.get_asset_name_from_browser_asset(_asset_file)
            asset_data = ab.get_asset_data_from_browser_asset(_asset_file)
            if asset_data is None:
                error_msg = ("POLIIGON_OT_asset_browser_import(): "
                             f"Asset {asset_name} not found!")
                reporting.capture_message(
                    "asset_browser_asset_not_found", error_msg, "error")
                cTB.logger_ab.error(error_msg)
                # TODO(Andreas): user notification
                continue

            asset_type = asset_data.asset_type
            if asset_type == AssetType.HDRI:
                # TODO(Andreas): Do actual import
                pass
            elif asset_type == AssetType.MODEL:
                # TODO(Andreas): Do actual import
                pass
            elif asset_type == AssetType.TEXTURE:
                # TODO(Andreas): Do actual import
                pass
            else:
                error_msg = ("POLIIGON_OT_asset_browser_import():"
                             f" Unexpected asset type: {asset_name} "
                             f"{asset_type}")
                reporting.capture_message(
                    "asset_browser_unexpected_type", error_msg, "error")
                cTB.logger_ab.error(error_msg)
                # TODO(Andreas): user notification
                continue

        return {"FINISHED"}
@@ -0,0 +1,84 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

from bpy.types import Operator

from ..modules.poliigon_core.multilingual import _t
from ..dialogs.dlg_quickmenu import show_quick_menu
from ..toolbox import get_context
from .. import reporting
from . import asset_browser as ab


# https://blender.stackexchange.com/questions/249837/how-do-i-get-the-selected-assets-in-the-asset-browser-using-the-api
class POLIIGON_OT_asset_browser_quick_menu(Operator):
    bl_idname = "poliigon.asset_browser_quick_menu"
    bl_label = _t("Show additional import options")
    bl_space_type = "FILE_BROWSER"

    @staticmethod
    def init_context(addon_version: str) -> None:
        """Called from operators.py to init global addon context."""

        global cTB
        cTB = get_context(addon_version)

    @classmethod
    def poll(cls, context):
        is_poliigon_lib = ab.is_poliigon_library(context)
        one_asset_selected = ab.get_num_selected_assets(context) == 1
        return is_poliigon_lib and one_asset_selected

    @classmethod
    def description(cls, context, properties):
        num_selected = ab.get_num_selected_assets(context)
        if num_selected == 1:
            return _t("Show additional import options")
        elif num_selected == 0:
            return _t("No asset selected.\nPlease select a single asset")
        else:
            return _t("Multiple assets selected.\nPlease "
                      "select only a single asset")

    @reporting.handle_operator(silent=True)
    def execute(self, context):
        if not ab.is_poliigon_library(context):
            # As the operator should only be shown for the Poliigon
            # library, we shouldn't be here
            error_msg = ("POLIIGON_OT_asset_browser_quick_menu(): "
                         "Poliigon library not selected!")
            reporting.capture_message(
                "asset_browser_lib_not_sel", error_msg, "error")
            return {"CANCELLED"}

        # poll() makes sure there's exactly one selected asset
        asset_file = ab.get_selected_assets(context)[0]

        asset_name = ab.get_asset_name_from_browser_asset(asset_file)
        asset_data = ab.get_asset_data_from_browser_asset(asset_file)

        if asset_data is None:
            error_msg = ("POLIIGON_OT_asset_browser_quick_menu(): "
                         f"Asset {asset_name} not found!")
            reporting.capture_message(
                "asset_browser_asset_not_found", error_msg, "error")
            self.report({"ERROR"}, f"Asset {asset_name} not found!")
            return {"CANCELLED"}

        show_quick_menu(cTB, asset_data=asset_data)
        return {"FINISHED"}
@@ -0,0 +1,88 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

from bpy.types import Operator
import bpy

from ..modules.poliigon_core.multilingual import _t
from ..toolbox import get_context
from .. import reporting
from . import asset_browser as ab


class POLIIGON_OT_asset_browser_reprocess(Operator):
    bl_idname = "poliigon.asset_browser_reprocess"
    bl_label = _t("Re-process Selected Assets")
    bl_space_type = "FILE_BROWSER"

    @staticmethod
    def init_context(addon_version: str) -> None:
        """Called from operators.py to init global addon context."""

        global cTB
        cTB = get_context(addon_version)

    @classmethod
    def poll(cls, context):
        if not ab.is_asset_browser(context):
            return False
        if not ab.is_poliigon_library(context):
            return False
        if not ab.is_only_poliigon_selected(context):
            return False
        if ab.get_num_selected_assets(context) == 0:
            return False
        return True

    @classmethod
    def description(cls, context, properties):
        num_selected = ab.get_num_selected_assets(context)
        if num_selected == 0:
            return _t("No asset selected.\nPlease select a single asset")
        else:
            return _t("Re-process selected assets")

    @reporting.handle_operator(silent=True)
    def execute(self, context):
        if not ab.is_poliigon_library(context):
            # As the operator should only be shown for the Poliigon
            # library, we shouldn't be here
            error_msg = ("POLIIGON_OT_asset_browser_reprocess(): "
                         "Poliigon library not selected!")
            reporting.capture_message(
                "asset_browser_reproc_not_sel", error_msg, "error")
            return {"CANCELLED"}

        asset_files = ab.get_selected_assets(context)

        for asset_file in asset_files:
            asset_name = ab.get_asset_name_from_browser_asset(asset_file)
            asset_data = ab.get_asset_data_from_browser_asset(asset_file)

            if asset_data is None:
                error_msg = ("POLIIGON_OT_asset_browser_reprocess(): "
                             f"Asset {asset_name} not found!")
                reporting.capture_message(
                    "asset_browser_reproc_asset_missing", error_msg, "error")
                self.report({"ERROR"}, f"Asset {asset_name} not found!")
                continue

            bpy.ops.poliigon.update_asset_browser(
                asset_id=asset_data.asset_id, force=True)

        return {"FINISHED"}
@@ -0,0 +1,44 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

from bpy.types import Operator

from ..modules.poliigon_core.multilingual import _t
from ..toolbox import get_context
from .. import reporting


class POLIIGON_OT_cancel_asset_browser_sync(Operator):
    bl_idname = "poliigon.cancel_asset_browser"
    bl_label = _t("Cancel Asset Sync")
    bl_category = "Poliigon"
    bl_description = _t("Cancel synchronization of local assets with the "
                        "Asset Browser")
    bl_options = {"INTERNAL"}

    @staticmethod
    def init_context(addon_version: str) -> None:
        """Called from operators.py to init global addon context."""

        global cTB
        cTB = get_context(addon_version)

    @reporting.handle_operator(silent=True)
    def execute(self, context):
        cTB.asset_browser_jobs_cancelled = True
        return {"FINISHED"}
@@ -0,0 +1,963 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

from dataclasses import dataclass
import json
import os
from queue import (
    Empty,
    Queue)
import shutil
import sys
from threading import Thread
import time
from typing import Any, Dict, List, Optional, Tuple
from uuid import uuid4

from bpy.types import Operator
from bpy.props import StringProperty
import bpy

from ..modules.poliigon_core.assets import (
    AssetData,
    AssetType)
from ..modules.poliigon_core.multilingual import _t
from ..material_import_utils import ASSET_TYPE_TO_IMPORTED_TYPE
from ..toolbox import get_context
from .. import reporting
from .asset_browser_sync_commands import (
    CMD_MARKER_START,
    CMD_MARKER_END,
    SyncCmd,
    SyncAssetBrowserCmd)


@dataclass
class ScriptContext():
    path_cat: Optional[str] = None  # from command line args
    path_categories: Optional[str] = None  # from command line args

    poliigon_categories: Optional[Dict] = None

    listener_running: bool = False
    thd_listener: Optional[Thread] = None

    sender_running: bool = False
    thd_sender: Optional[Thread] = None

    queue_cmd: Optional[Queue] = None
    queue_send: Optional[Queue] = None
    queue_ack: Optional[Queue] = None

    main_running: bool = False


class POLIIGON_OT_sync_client(Operator):
    bl_idname = "poliigon.asset_browser_sync_client"
    bl_label = _t("Sync Client")
    bl_category = "Poliigon"
    bl_description = _t("To be used in client Blender process to work on "
                        "commands sent by host blender.")
    bl_options = {"INTERNAL"}

    path_catalog: StringProperty(options={"HIDDEN"})  # noqa: F821
    path_categories: StringProperty(options={"HIDDEN"})  # noqa: F821

    @staticmethod
    def init_context(addon_version: str) -> None:
        """Called from operators.py to init global addon context."""

        global cTB
        cTB = get_context(addon_version)

    @staticmethod
    def _check_command(ctx: ScriptContext,
                       buf: str
                       ) -> Tuple[Optional[str], str]:
        """Returns the JSON string of a valid command, otherwise None.
        Upon detecting a corrupted command, CMD_ERROR gets sent.

        Return value:
            Tuple with two entries:
            Tuple[0]: The JSON string of a valid command or None
            Tuple[1]: Remaining buf after either a valid command got detected
                      or a broken command got removed
        """

        if CMD_MARKER_END not in buf:
            return None, buf

        pos_delimiter = buf.find(CMD_MARKER_END, 1)
        cmd_json = buf[:pos_delimiter]
        buf = buf[pos_delimiter + len(CMD_MARKER_END):]

        if CMD_MARKER_START in cmd_json:
            pos_marker_start = cmd_json.find(CMD_MARKER_START, 1)
            cmd_json = cmd_json[pos_marker_start + len(CMD_MARKER_START):]
        else:
            ctx.queue_send.put(SyncAssetBrowserCmd(code=SyncCmd.CMD_ERROR))
            cmd_json = None
        return cmd_json, buf

    @staticmethod
    def _thread_listener(ctx: ScriptContext) -> None:
        """Listens to commands sent by host and checks their integrity.

        In case of an error, requests the command to be re-sent by the host
        via CMD_ERROR.
        Valid commands are then sorted into two queues, one for received acks
        (forwarding them to unblock sender), one for job commands (forwarding
        them to main loop).
        """

        cTB.logger_ab.debug("thread_listener")
        ctx.listener_running = True
        buf = ""
        while ctx.listener_running:
            # Wait for messages from host, concatenating received lines
            # into buf
            try:
                buf += sys.stdin.readline()
            except KeyboardInterrupt:
                time.sleep(0.5)
                if ctx.listener_running:
                    continue

            if not ctx.listener_running:
                break

            cmd_json, buf = POLIIGON_OT_sync_client._check_command(ctx, buf)
            if cmd_json is None:
                continue

            try:
                cmd_from_host = SyncAssetBrowserCmd.from_json(cmd_json)
                if cmd_from_host.code == SyncCmd.CMD_ERROR:
                    # Forward ack to thread_sender
                    ctx.queue_ack.put(cmd_from_host)
                else:
                    # Forward job command to main loop
                    ctx.queue_cmd.put(cmd_from_host)
            except Exception:
                ctx.queue_send.put(SyncAssetBrowserCmd(code=SyncCmd.CMD_ERROR))
                cTB.logger_ab.exception(f"CMD ERROR {cmd_json}")

        cTB.logger_ab.debug("thread_listener EXIT")
        ctx.thd_listener = None

    @staticmethod
    def _start_listener(ctx: ScriptContext) -> None:
        """Starts thread_listener()"""

        ctx.thd_listener = Thread(
            target=POLIIGON_OT_sync_client._thread_listener,
            args=(ctx, ),
            daemon=True)
        ctx.thd_listener.start()

    @staticmethod
    def _flush_queue_ack(ctx: ScriptContext) -> None:
        """Removes all content from ack queue"""

        while not ctx.queue_ack.empty():
            try:
                ctx.queue_ack.get_nowait()
            except Empty:
                # Empty is imported from the queue module,
                # it is not an attribute of the Queue instance
                break

    @staticmethod
    def _shutdown_on_error(ctx: ScriptContext) -> None:
        """Shuts the client down"""

        # The client quits and host will pick up the "client loss" due to
        # timeouts
        ctx.sender_running = False
        ctx.listener_running = False
        ctx.main_running = False
        sys.stdin.close()  # unblock listener

    @staticmethod
    def _thread_sender(ctx: ScriptContext) -> None:
        """Sends commands to host.

        For commands expecting an acknowledge message the thread will then
        block until the ack is received (or possibly resend the command if
        CMD_ERROR is received).
        """

        cTB.logger_ab.debug("thread_sender")
        ctx.sender_running = True
        while ctx.sender_running:
            # Get rid of any unwanted acks from previous commands
            POLIIGON_OT_sync_client._flush_queue_ack(ctx)

            # Wait for something to send
            try:
                cmd_send = ctx.queue_send.get(timeout=1.0)
                ctx.queue_send.task_done()
            except Empty:
                if ctx.sender_running:
                    continue

            if not ctx.sender_running:
                break

            cTB.logger_ab.debug(f"Send: {cmd_send.code.name}")
            cmd_send.send_to_stdio()

            # Depending on sent command code, we are already done
            if cmd_send.code in [SyncCmd.ASSET_OK, SyncCmd.ASSET_ERROR]:
                # ASSET_OK, ASSET_ERROR are fire and forget,
                # just proceed with next command
                continue
            elif cmd_send.code == SyncCmd.EXIT_ACK:
                # EXIT_ACK is fire and forget, we are done here
                ctx.sender_running = False
                break

            # Wait for acknowledge message
            # TODO(Andreas): Currently this low retry count causes issues with
            #                Patrick's library on a NAS and then likely exposes
            #                a bug in timeout handling.
            retries = 3
            while retries > 0 and ctx.sender_running:
                try:
                    cmd_ack = ctx.queue_ack.get(timeout=15.0)
                    ctx.queue_ack.task_done()
                except Empty:
                    cmd_ack = None

                retries -= 1
                if cmd_ack is None:
                    # queue timeout,
                    # unless retries are exhausted continue to wait
                    if retries == 0:
                        # Unlikely we can gracefully recover
                        POLIIGON_OT_sync_client._shutdown_on_error(ctx)
                        break
                elif cmd_ack.code == SyncCmd.CMD_ERROR:
                    # last sent command was not received well -> resend
                    if retries > 0:
                        cmd_send.send_to_stdio()
                    else:
                        # Unlikely we can gracefully recover
                        POLIIGON_OT_sync_client._shutdown_on_error(ctx)
                elif cmd_ack.code == SyncCmd.CMD_DONE:
                    # last sent command was ok, continue with next
                    break

        cTB.logger_ab.debug("thread_sender EXIT")
        ctx.thd_sender = None

    @staticmethod
    def _start_sender(ctx: ScriptContext) -> None:
        """Starts thread_sender()"""

        ctx.thd_sender = Thread(
            target=POLIIGON_OT_sync_client._thread_sender,
            args=(ctx, ),
            daemon=True)
        ctx.thd_sender.start()

    @staticmethod
    def _startup(ctx: ScriptContext) -> bool:
        """Waits for asset data, reads the Poliigon categories and starts the
        listener and sender threads. Returns False upon failure.
        """

        cTB.logger_ab.debug("waiting for asset data...")
        bpy.ops.poliigon.get_local_asset_sync(
            await_startup_poliigon=False,
            await_startup_my_assets=True,
            get_poliigon=False,
            get_my_assets=False,
            abort_ongoing_jobs=False)
        cTB.logger_ab.debug("...done.")

        if not POLIIGON_OT_sync_client._read_poliigon_categories(ctx):
            return False

        ctx.queue_cmd = Queue()
        ctx.queue_send = Queue()
        ctx.queue_ack = Queue()

        POLIIGON_OT_sync_client._start_listener(ctx)
        POLIIGON_OT_sync_client._start_sender(ctx)

        ctx.queue_send.put(SyncAssetBrowserCmd(code=SyncCmd.HELLO))
        return True

    @staticmethod
    def _reset_blend():
        """Prepares a fresh blend file for stuff to be imported into."""

        bpy.ops.wm.read_homefile(use_empty=True)

        # To be safe deselect all
        for obj in bpy.data.objects:
            obj.select_set(False)

    @staticmethod
    def _save_blend(path: str) -> bool:
        """Saves the current blend file."""

        # Remove previous file
        # (host will re-process assets only if the force parameter was set)
        if os.path.exists(path):
            os.remove(path)

        path_norm = os.path.normpath(path)
        result = bpy.ops.wm.save_mainfile(filepath=path_norm,
                                          check_existing=False,
                                          exit=False)
        return result == {"FINISHED"}

    @staticmethod
    def _get_unique_uuid(catalog_dict: Dict) -> str:
        """Returns a new, random UUID, which does not already exist
        in catalog.
        """

        uuid_is_unique = False
        while not uuid_is_unique:
            uuid_result = str(uuid4())
            uuid_is_unique = True
            for uuid_existing, _, _ in catalog_dict.values():
                if uuid_result == uuid_existing:
                    uuid_is_unique = False
                    break
        return uuid_result

    # Based on code from:
    # https://blender.stackexchange.com/questions/249316/python-set-asset-library-tags-and-catalogs
    @staticmethod
    def _get_catalog_dict(ctx: ScriptContext) -> Dict:
        """Reads Blender's catalogue and returns a dictionary with its content.

        Return value:
            Dict: {catalog tree path: (uuid, catalog tree path, catalog name)}
        """

        if not os.path.exists(ctx.path_cat):
            return {}
        catalogs = {}
        with open(ctx.path_cat, "r") as file_catalogs:
            for line in file_catalogs.readlines():
                if line.startswith(("#", "VERSION", "\n")):
                    continue
                # Each line contains:
                # 'uuid:catalog_tree:catalog_name' + eol ('\n')
                # maxsplit=2 keeps catalog names containing ':' intact
                uuid, tree_path, name = line.split(":", 2)
                name = name.rstrip("\n")
                catalogs[tree_path] = (uuid, tree_path, name)
        return catalogs

    @staticmethod
    def _catalog_file_header(version: int = 1) -> str:
        """Returns the standard header of a catalog file."""

        header = (
            "# This is an Asset Catalog Definition file for Blender.\n"
            "#\n"
            "# Empty lines and lines starting with `#` will be ignored.\n"
            "# The first non-ignored line should be the version indicator.\n"
            '# Other lines are of the format "UUID:catalog/path/for/assets:simple catalog name"\n'
            "\n"
            f"VERSION {version}\n"
            "\n")
        return header

    @staticmethod
    def _write_catalog_file(ctx: ScriptContext, catalog_dict: Dict) -> bool:
        """Writes a catalog dict into a new catalog file,
        replacing the old file upon success.
        """

        path_cat_temp = ctx.path_cat + ".TEMP"
        path_cat_bak = ctx.path_cat + ".BAK"
        try:
            # Write into temporary file
            with open(path_cat_temp, "w") as file_catalogs:
                header = POLIIGON_OT_sync_client._catalog_file_header()
                file_catalogs.write(header)
                for _uuid, tree_path, name in catalog_dict.values():
                    file_catalogs.write(f"{_uuid}:{tree_path}:{name}\n")

            # Replace existing catalog file (if any) with above temporary file
            if os.path.exists(ctx.path_cat):
                shutil.move(ctx.path_cat, path_cat_bak)
            shutil.move(path_cat_temp, ctx.path_cat)
            if os.path.exists(path_cat_bak):
                os.remove(path_cat_bak)
        except IsADirectoryError:
            # Should not occur, it's our files
            cTB.logger_ab.exception("IsADirectoryError")
            return False
        except FileNotFoundError:
            # Should not occur, it's being tested above
            cTB.logger_ab.exception("FileNotFoundError")
            return False
        except OSError:
            # Failed to create file
            cTB.logger_ab.exception("OSError")
            return False
        except Exception:
            cTB.logger_ab.exception("Unexpected exception!")
            return False
        return True

    @staticmethod
    def _read_poliigon_categories(ctx: ScriptContext) -> bool:
        """Reads all Poliigon categories into a dict in context."""

        if not os.path.exists(ctx.path_categories):
            cTB.logger_ab.debug("Poliigon categories file missing")
            ctx.poliigon_categories = {"HDRIs": [],
                                       "Models": [],
                                       "Textures": []
                                       }
            return False

        with open(ctx.path_categories, "r") as file_categories:
            try:
                ctx.poliigon_categories = json.load(file_categories)
            except json.JSONDecodeError:
                cTB.logger_ab.debug("Poliigon's category file is corrupt!")
                return False

        ctx.poliigon_categories = ctx.poliigon_categories["poliigon"]
        return True

    @staticmethod
    def _get_unique_category_list(
            ctx: ScriptContext, asset_data: AssetData) -> List[str]:
        """Returns the list of categories along the first (alphabetically)
        matching branch in Poliigon's category tree."""

        asset_type = asset_data.asset_type
        asset_name = asset_data.asset_name

        asset_type_cat = ASSET_TYPE_TO_IMPORTED_TYPE[asset_type]
        if asset_type_cat not in ctx.poliigon_categories:
            cTB.logger_ab.debug("!!! Asset type not found "
                                f"{asset_name} {asset_type}")
            cTB.logger_ab.debug("    Category types "
                                f"{list(ctx.poliigon_categories.keys())}")
            return [asset_type.name]

        # Have a copy, as we are removing some categories during the process
        asset_categories = asset_data.categories.copy()

        if "free" in asset_categories:
            asset_categories.remove("free")
        if asset_type_cat in asset_categories:
            # It gets prepended anyway in next step
            asset_categories.remove(asset_type_cat)

        category_list = [asset_type_cat]
        cat_slug = ""
        for cat in asset_categories:
            cat = cat.title()
            cat_slug += "/" + cat
            if cat_slug not in ctx.poliigon_categories[asset_type_cat]:
                break
            category_list.append(cat)

        return category_list

    @staticmethod
    def _add_catalog(
            ctx: ScriptContext, asset_data: AssetData, entity: Any) -> bool:
        """Assigns a catalog to an entity (object, collection, material,
        world,...).

        If needed, the catalog file will be extended with additional catalogs
        based on the categories of the asset.
        """

        catalog_dict = POLIIGON_OT_sync_client._get_catalog_dict(ctx)
        asset_categories = POLIIGON_OT_sync_client._get_unique_category_list(
            ctx, asset_data)

        # After this loop uuid_result contains the UUID of the leaf catalog
        for idx_cat, category in enumerate(asset_categories):
            category_path = "/".join(asset_categories[:idx_cat + 1])
            if category_path not in catalog_dict:
                uuid_result = POLIIGON_OT_sync_client._get_unique_uuid(
                    catalog_dict)
                catalog_dict[category_path] = (uuid_result,
                                               category_path,
                                               category)
            else:
                uuid_result, _, _ = catalog_dict[category_path]

        if not POLIIGON_OT_sync_client._write_catalog_file(
                ctx, catalog_dict):
            cTB.logger_ab.debug("add_catalog(): Failed to write catalog file")
            return False

        # Finally assign the determined UUID to the entity
        entity.asset_data.catalog_id = uuid_result
        return True

    @staticmethod
    def _assign_asset_tags(
            asset_data: AssetData, entity: Any, params: Dict) -> None:
        """Assigns tags to an entity (object, collection, material, world,...).

        NOTE: This function requires entity.asset_mark() to be called
              beforehand.

        Args:
            asset_data: AssetData
            params: Populated by host in function
                    asset_browser.py:get_asset_job_parameters()
        """

        asset_name = asset_data.asset_name
        asset_display_name = asset_data.display_name
        asset_type = asset_data.asset_type

        entity.asset_data.tags.new(asset_display_name)
        entity.asset_data.tags.new(asset_name)  # unique name
        entity.asset_data.tags.new("Poliigon")
        for category in asset_data.categories:
            # TODO(Andreas): maybe we want to filter free?
            entity.asset_data.tags.new(category.title())

        if asset_type == AssetType.HDRI:
            entity.asset_data.tags.new(params["size"])
            entity.asset_data.tags.new(params["size_bg"])
        elif asset_type == AssetType.MODEL:
            entity.asset_data.tags.new(params["size"])
            entity.asset_data.tags.new(params["lod"])
        elif asset_type == AssetType.TEXTURE:
            entity.asset_data.tags.new(params["size"])
        else:
            raise NotImplementedError(f"Unsupported asset type: {asset_type}")

    @staticmethod
    def _assign_asset_preview(
            asset_data: AssetData, entity: Any, params: Dict) -> None:
        """Assigns a preview image to an entity (object, collection, material,
        world,...).

        NOTE: This function requires entity.asset_mark() to be called
              beforehand.

        Args:
            asset_data: AssetData
            entity: Blender's object, collection, material, ...
            params: Populated by host in function
                    asset_browser.py:get_asset_job_parameters()
        """

        path_thumb = params["thumb"]
        is_path = path_thumb is not None and len(path_thumb) > 2
        if is_path and os.path.exists(path_thumb):
            # From: https://blender.stackexchange.com/questions/6101/poll-failed-context-incorrect-example-bpy-ops-view3d-background-image-add
            # and: https://blender.stackexchange.com/questions/245397/batch-assign-pre-existing-image-files-as-asset-previews

            # equal to: if bpy.app.version >= (3, 2, 0)
            if hasattr(bpy.context, "temp_override"):
                with bpy.context.temp_override(id=entity):
                    bpy.ops.ed.lib_id_load_custom_preview(
                        filepath=path_thumb)
            else:
                bpy.ops.ed.lib_id_load_custom_preview(
                    {"id": entity}, filepath=path_thumb)
        else:
            # TODO(Andreas): Not working as expected
            #                Maybe https://developer.blender.org/T93893 ?
            entity.asset_generate_preview()

    @staticmethod
    def _assign_asset_meta_data(ctx: ScriptContext,
                                asset_data: AssetData,
                                entity: Any,
                                params: Dict) -> bool:
        """Assigns all meta data (e.g. author, tags, preview, catalog...) to an
        entity (object, collection, material, world,...).

        Args:
            ctx: ScriptContext instance created upon script start
            asset_data: AssetData
            entity: Blender's object, collection, material, ...
            params: Populated by host in function
                    asset_browser.py:get_asset_job_parameters()
        """

        if hasattr(entity, "type"):
            type_label = f", type: {entity.type}"
        elif isinstance(entity, bpy.types.Material):
            type_label = ", type: Material"
        else:
            type_label = ", type: UNKNOWN"
        cTB.logger_ab.debug(f"Marking {entity.name} {type_label}")

        entity.asset_mark()

        entity.asset_data.author = "Poliigon"
        entity.asset_data.description = asset_data.display_name

        try:
            POLIIGON_OT_sync_client._assign_asset_tags(
                asset_data, entity, params)
        except NotImplementedError:
            cTB.logger_ab.exception("Unsupported Asset Type")
            return False
        POLIIGON_OT_sync_client._assign_asset_preview(
            asset_data, entity, params)
        if not POLIIGON_OT_sync_client._add_catalog(ctx, asset_data, entity):
            cTB.logger_ab.debug(
                "assign_asset_meta_data(): Failed to add catalog")
            return False

        return True

    @staticmethod
    def _process_hdri(
            ctx: ScriptContext, asset_data: AssetData, params: Dict) -> bool:
        """Processes an HDRI asset.

        Args:
            ctx: ScriptContext instance created upon script start
            asset_data: AssetData
            params: Populated by host in function
                    asset_browser.py:get_asset_job_parameters()
        """

        has_size = "size" in params
        has_size_bg = "size_bg" in params
        has_thumb = "thumb" in params
        if not has_size or not has_size_bg or not has_thumb:
            cTB.logger_ab.debug(
                "Missing required parameter (size, size_bg and/or thumb) to "
                "process HDRI")
            return False

        asset_id = asset_data.asset_id
        asset_name = asset_data.asset_name
        size = params["size"]
        size_bg = params["size_bg"]

        cTB.logger_ab.debug(f"process_hdri {asset_name} {size}")
        try:
            result = bpy.ops.poliigon.poliigon_hdri(
                asset_id=asset_id,
                size=size,
                size_bg=size_bg)
        except Exception:
            cTB.logger_ab.exception("HDRI ERROR")
            return False

        if result != {"FINISHED"}:
            return False

        # Rename world,
        # otherwise the asset would appear as "World" in the Asset Browser.
        world = bpy.context.scene.world
        world.name = asset_name

        if not POLIIGON_OT_sync_client._assign_asset_meta_data(
                ctx, asset_data, world, params):
            return False

        return True

    @staticmethod
    def _process_model(
            ctx: ScriptContext, asset_data: AssetData, params: Dict) -> bool:
        """Processes a Model asset.

        Args:
            ctx: ScriptContext instance created upon script start
            asset_data: AssetData
            params: Populated by host in function
                    asset_browser.py:get_asset_job_parameters()
        """

        has_size = "size" in params
        has_lod = "lod" in params
        has_thumb = "thumb" in params
        if not has_size or not has_lod or not has_thumb:
            cTB.logger_ab.debug(
                "Missing required parameter (size, lod and/or thumb) to "
                "process Model")
            return False

        asset_id = asset_data.asset_id
        asset_name = asset_data.asset_name

        size = params["size"]
        lod = params["lod"]

        cTB.logger_ab.debug(f"process_model {asset_name} {size} {lod}")

        try:
            result = bpy.ops.poliigon.poliigon_model(
                asset_id=asset_id,
                size=size,
                lod=lod,
                do_use_collection=True,
                do_link_blend=True,
                do_reuse_materials=False)
        except Exception:
            cTB.logger_ab.exception("MODEL ERROR")
            return False

        if result != {"FINISHED"}:
            cTB.logger_ab.error("LOAD FAILURE")
            return False

        # Mark the object instancing our collection
        found = False
        error = False
        for obj in bpy.data.objects:
            if obj.type != "EMPTY":
                continue
            if obj.parent is not None:
                continue
            if not obj.name.startswith(asset_name):
                continue
            if obj.instance_collection is None:
                continue

            if POLIIGON_OT_sync_client._assign_asset_meta_data(
                    ctx, asset_data, obj, params):
                found = True
            else:
                error = True
            break
        return found and not error

    @staticmethod
    def _process_texture(
            ctx: ScriptContext, asset_data: AssetData, params: Dict) -> bool:
        """Processes a Texture asset (including backplates and backdrops).

        Args:
            ctx: ScriptContext instance created upon script start
            asset_data: AssetData
            params: Populated by host in function
                    asset_browser.py:get_asset_job_parameters()
        """

        if "size" not in params or "thumb" not in params:
            cTB.logger_ab.debug(
                "Missing required parameter (size and/or thumb) to process "
                "Texture")
            return False

        asset_id = asset_data.asset_id
        asset_name = asset_data.asset_name
        size = params["size"]

        cTB.logger_ab.debug(f"process_texture {asset_name} {size}")

        try:
            result = bpy.ops.poliigon.poliigon_material(
                asset_id=asset_id, size=size)
        except Exception:
            cTB.logger_ab.exception("MATERIAL ERROR")
            return False

        if result != {"FINISHED"}:
            cTB.logger_ab.debug(
                f"Operator poliigon_material returned: {result}")
            return False

        found = False
        error = False
        for mat in bpy.data.materials:
            if not mat.name.startswith(asset_name):
                continue

            if POLIIGON_OT_sync_client._assign_asset_meta_data(
                    ctx, asset_data, mat, params):
                found = True
            else:
                error = True
                cTB.logger_ab.debug(
                    f"Failed to assign meta data to material: {mat.name}")
                break

        if not found:
            cTB.logger_ab.debug("Found no entity to mark")

        return found and not error

    @staticmethod
    def _process_asset(
            ctx: ScriptContext, asset_data: AssetData, params: Dict) -> bool:
        """Creates and saves an Asset Browser-marked asset to a new blend file.

        Args:
            ctx: ScriptContext instance created upon script start
            asset_data: AssetData
            params: Populated by host in function
                    asset_browser.py:get_asset_job_parameters()
        """

        if "path_result" not in params:
            cTB.logger_ab.debug("process_asset(): Lacking result path!")
            return False
        path_result = params["path_result"]

        POLIIGON_OT_sync_client._reset_blend()

        asset_name = asset_data.asset_name
        asset_type = asset_data.asset_type

        cTB.logger_ab.debug(f"process_asset() {asset_name}")
        for _param, value in params.items():
            cTB.logger_ab.debug(f"    {_param} {value}")

        if asset_type == AssetType.HDRI:
            result = POLIIGON_OT_sync_client._process_hdri(
                ctx, asset_data, params)
        elif asset_type == AssetType.MODEL:
            result = POLIIGON_OT_sync_client._process_model(
                ctx, asset_data, params)
        elif asset_type == AssetType.TEXTURE:
            result = POLIIGON_OT_sync_client._process_texture(
                ctx, asset_data, params)
        else:
            cTB.logger_ab.debug("process_asset(): Unknown asset type")
            return False

        if result:
            result = POLIIGON_OT_sync_client._save_blend(path_result)
        return result

    @staticmethod
    def _cmd_asset(ctx: ScriptContext, cmd: SyncAssetBrowserCmd) -> None:
        """Handle an ASSET command"""

        asset_id = cmd.data["asset_id"]
        asset_data = cTB._asset_index.get_asset(asset_id)
        if asset_data is None:
            bpy.ops.poliigon.get_local_asset_sync(
                await_startup_poliigon=False,
                await_startup_my_assets=False,
                get_poliigon=False,
                get_my_assets=True,
                asset_id=asset_id,
                abort_ongoing_jobs=False)
            asset_data = cTB._asset_index.get_asset(asset_id)

        if asset_data is not None:
            result = POLIIGON_OT_sync_client._process_asset(
                ctx, asset_data, cmd.params)
        else:
            cTB.logger_ab.error(
                f"process_asset(): No asset data for {asset_id}.")
            result = False

        if result:
            ctx.queue_send.put(SyncAssetBrowserCmd(code=SyncCmd.ASSET_OK,
                                                   data=cmd.data))
        else:
            ctx.queue_send.put(SyncAssetBrowserCmd(code=SyncCmd.ASSET_ERROR,
                                                   data=cmd.data))
        # Log the asset ID, as asset_data may still be None at this point
        cTB.logger_ab.debug(f"cmd_asset exit {asset_id}")

    @staticmethod
    def _cmd_hello_ok(ctx: ScriptContext, cmd: SyncAssetBrowserCmd) -> None:
        """Handle a HELLO_OK command"""

        cTB.logger_ab.debug("cmd_hello_ok")
        ctx.queue_ack.put(SyncAssetBrowserCmd(code=SyncCmd.CMD_DONE))

    @staticmethod
    def _cmd_still_there(ctx: ScriptContext, cmd: SyncAssetBrowserCmd) -> None:
        """Handle a STILL_THERE command"""

        cTB.logger_ab.debug("cmd_still_there")
        bpy.ops.poliigon.get_local_asset_sync(
            await_startup_poliigon=False,
            await_startup_my_assets=True,
            get_poliigon=False,
            get_my_assets=False,
            abort_ongoing_jobs=False)
        ctx.queue_send.put(SyncAssetBrowserCmd(code=SyncCmd.HELLO))

    @staticmethod
    def _cmd_exit(ctx: ScriptContext, cmd: SyncAssetBrowserCmd) -> None:
        """Handle an EXIT command"""

        cTB.logger_ab.debug("cmd_exit")
        # Notify host, we are going to exit
        ctx.queue_send.put(SyncAssetBrowserCmd(code=SyncCmd.EXIT_ACK))
        # Tear down everything
        ctx.listener_running = False
        if ctx.thd_listener is not None:
            ctx.thd_listener.join()
        if ctx.thd_sender is not None:
            ctx.thd_sender.join()
        ctx.main_running = False

    def _init_settings(self) -> None:
        """Changes settings to what is needed by sync client
        (backing up the original settings).
        """

        self.settings_backup = {}
        for _key in ["download_prefer_blend",
                     "download_link_blend"]:
            self.settings_backup[_key] = cTB.settings[_key]

        cTB.settings["download_prefer_blend"] = 1
        cTB.settings["download_link_blend"] = 0

    def _restore_settings(self) -> None:
        """Restores settings from backup."""

        for _key in ["download_prefer_blend",
                     "download_link_blend"]:
            cTB.settings[_key] = self.settings_backup[_key]

@reporting.handle_operator(silent=True)
|
||||
def execute(self, context):
|
||||
has_catalog = self.path_catalog is not None
|
||||
has_categories = self.path_categories is not None
|
||||
if not has_catalog or not has_categories:
|
||||
return {"CANCELLED"}
|
||||
|
||||
ctx = ScriptContext(
|
||||
path_cat=self.path_catalog,
|
||||
path_categories=self.path_categories)
|
||||
|
||||
if not self._startup(ctx):
|
||||
return {"CANCELLED"}
|
||||
|
||||
ctx.main_running = True
|
||||
while ctx.main_running:
|
||||
try:
|
||||
cmd_recv = ctx.queue_cmd.get(timeout=1.0)
|
||||
ctx.queue_cmd.task_done()
|
||||
except Empty:
|
||||
continue
|
||||
|
||||
if cmd_recv is None:
|
||||
continue
|
||||
|
||||
if cmd_recv.code == SyncCmd.EXIT:
|
||||
self._cmd_exit(ctx, cmd_recv)
|
||||
elif cmd_recv.code == SyncCmd.ASSET:
|
||||
self._init_settings()
|
||||
self._cmd_asset(ctx, cmd_recv)
|
||||
self._restore_settings()
|
||||
elif cmd_recv.code == SyncCmd.STILL_THERE:
|
||||
self._cmd_still_there(ctx, cmd_recv)
|
||||
elif cmd_recv.code == SyncCmd.HELLO_OK:
|
||||
self._cmd_hello_ok(ctx, cmd_recv)
|
||||
|
||||
return {"FINISHED"}
|
||||
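The `execute()` loop above polls a queue with a timeout so the operator can re-check its running flag between commands instead of blocking forever. A minimal standalone sketch of that pattern (hypothetical integer command codes, no Blender dependency):

```python
import queue
from queue import Empty


def command_loop(cmd_queue: "queue.Queue", stop_code: int) -> list:
    """Poll the queue with a timeout; dispatch commands until stop_code arrives."""
    handled = []
    running = True
    while running:
        try:
            cmd = cmd_queue.get(timeout=0.1)
            cmd_queue.task_done()
        except Empty:
            # Timeout: loop again; this is where a running flag can be checked.
            continue
        if cmd == stop_code:
            running = False
        else:
            handled.append(cmd)
    return handled
```

The timeout keeps the loop responsive to shutdown even when no commands arrive, at the cost of a small amount of idle wakeup.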
@@ -0,0 +1,90 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

from threading import Thread

from bpy.types import Operator
from bpy.props import (
    BoolProperty,
    IntProperty
)
import bpy

from ..modules.poliigon_core.multilingual import _t
from ..constants import ASSET_ID_ALL
from ..toolbox import get_context
from .. import reporting
from . import asset_browser as ab


class POLIIGON_OT_update_asset_browser(Operator):
    bl_idname = "poliigon.update_asset_browser"
    bl_label = _t("Sync Local Assets")
    bl_category = "Poliigon"
    bl_description = _t("Synchronize local assets with Asset Browser")
    bl_options = {"INTERNAL"}

    asset_id: IntProperty(options={"HIDDEN"}, default=ASSET_ID_ALL)  # noqa: F821
    force: BoolProperty(options={"HIDDEN"}, default=False)  # noqa: F821

    @staticmethod
    def init_context(addon_version: str) -> None:
        """Called from operators.py to init global addon context."""

        global cTB
        cTB = get_context(addon_version)

    @reporting.handle_operator(silent=True)
    def execute(self, context):
        if bpy.app.version < (3, 0):
            self.report(
                {"ERROR"},
                "Asset browser not available in this Blender version")
            return {"CANCELLED"}

        bpy.ops.poliigon.get_local_asset_sync(
            await_startup_poliigon=False,
            await_startup_my_assets=False,
            get_poliigon=False,
            get_my_assets=True,
            asset_id=self.asset_id)

        if self.asset_id != ASSET_ID_ALL:
            asset_ids = cTB._asset_index.get_asset_id_list(
                purchased=True, local=True)
            if self.asset_id not in asset_ids:
                self.report(
                    {"ERROR"},
                    f"Asset ID {self.asset_id} not found")
                return {"CANCELLED"}

        if ab.create_poliigon_library() is None:
            cTB.logger_ab.debug("HOST: No Poliigon library in Asset Browser!")
            error_msg = "No Poliigon library in Asset Browser."
            reporting.capture_message(
                "asset_browser_no_polii_lib", error_msg, "error")
            self.report({"ERROR"}, error_msg)
            return {"CANCELLED"}

        thd_init_sync = Thread(
            target=ab.thread_initiate_asset_synchronization,
            args=(self.asset_id, self.force, ))
        thd_init_sync.start()
        cTB.threads.append(thd_init_sync)

        return {"FINISHED"}
@@ -0,0 +1,47 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

import bpy

from .asset_browser_operator_import import POLIIGON_OT_asset_browser_import
from .asset_browser_operator_quick_menu import POLIIGON_OT_asset_browser_quick_menu
from .asset_browser_operator_reprocess import POLIIGON_OT_asset_browser_reprocess
from .asset_browser_operator_sync_cancel import POLIIGON_OT_cancel_asset_browser_sync
from .asset_browser_operator_sync_client import POLIIGON_OT_sync_client
from .asset_browser_operator_update import POLIIGON_OT_update_asset_browser


classes = (
    POLIIGON_OT_update_asset_browser,
    POLIIGON_OT_cancel_asset_browser_sync,
    POLIIGON_OT_asset_browser_import,
    POLIIGON_OT_asset_browser_quick_menu,
    POLIIGON_OT_asset_browser_reprocess,
    POLIIGON_OT_sync_client
)


def register(addon_version: str):
    for cls in classes:
        bpy.utils.register_class(cls)
        cls.init_context(addon_version)


def unregister():
    for cls in reversed(classes):
        bpy.utils.unregister_class(cls)
@@ -0,0 +1,129 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

"""Standalone Blender startup script used to generate asset blend files.

General Asset Browser links:
Asset Catalogs: https://wiki.blender.org/wiki/Source/Architecture/Asset_System/Catalogs
Asset Operators: https://docs.blender.org/api/current/bpy.ops.asset.html
AssetMetaData: https://docs.blender.org/api/current/bpy.types.AssetMetaData.html#bpy.types.AssetMetaData
Catalogs and save: https://blender.stackexchange.com/questions/284833/get-asset-browser-catalogs-in-case-of-unsaved-changes

Operator overriding:
https://blender.stackexchange.com/questions/248274/a-comprehensive-list-of-operator-overrides
https://blender.stackexchange.com/questions/129989/override-context-for-operator-called-from-panel
https://blender.stackexchange.com/questions/182713/how-to-use-context-override-on-the-disable-and-keep-transform-operator
https://blender.stackexchange.com/questions/273474/how-to-override-context-to-launch-ops-commands-in-text-editor-3-2
https://blender.stackexchange.com/questions/875/proper-bpy-ops-context-setup-in-a-plugin

Asset browser related, not much use in here:
https://blender.stackexchange.com/questions/262284/how-do-i-access-the-list-of-selected-assets-from-an-event-in-python
https://blender.stackexchange.com/questions/261213/get-the-source-path-of-the-assets-in-asset-browser-using-python
"""

import argparse
import os
import sys

import bpy

from typing import Tuple
sys.path.insert(0, os.path.dirname(os.path.dirname(__file__)))

from modules.poliigon_core.multilingual import _t  # noqa: E402
from constants import ADDON_NAME  # noqa: E402


DEBUG_CLIENT = False


def print_debug(*args, file=sys.stdout) -> None:
    """Use for printing in client script"""

    if not DEBUG_CLIENT:
        return
    print(" C:", *args, file=file)


def command_line_args() -> Tuple[str, str]:
    """Parses command line args."""

    path_catalog = None
    path_categories = None

    # Skip Blender's own command line args
    argv = sys.argv
    try:
        idx_arg = argv.index("--") + 1
    except ValueError:
        idx_arg = None
    if idx_arg is None or idx_arg >= len(argv):
        return None, None

    argv = argv[idx_arg:]

    try:
        parser = argparse.ArgumentParser()
        parser.add_argument("-pcf", "--poliigon_cat_file",
                            help=_t("Path to catalog file"),
                            required=True)
        parser.add_argument("-pc", "--poliigon_categories",
                            help=_t("Path to file with Poliigon categories"),
                            required=True)
        args = parser.parse_args(argv)
    except (SystemExit, Exception) as e:
        # Note: argparse raises SystemExit on parse errors,
        # which a bare "except Exception" would miss.
        print_debug(e)
        return None, None

    if args.poliigon_cat_file:
        path_catalog = args.poliigon_cat_file
    else:
        print_debug(
            "Lacking path to Blender cat file in commandline arguments!")
        return None, None

    if args.poliigon_categories:
        path_categories = args.poliigon_categories
    else:
        print_debug(
            "Lacking path to Poliigon categories file in commandline "
            "arguments!")
        return None, None

    return path_catalog, path_categories


def main():
    print_debug("Hello Blender host, I am the client")

    path_catalog, path_categories = command_line_args()
    has_catalog = path_catalog is not None
    has_categories = path_categories is not None
    if not has_catalog or not has_categories:
        print_debug("Missing catalog or categories path.")
        return

    bpy.ops.preferences.addon_enable(module=ADDON_NAME)

    bpy.ops.poliigon.asset_browser_sync_client(
        path_catalog=path_catalog, path_categories=path_categories)

    print_debug("Subprocess exit")


if __name__ == "__main__":
    main()
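`command_line_args()` relies on Blender's convention of passing everything after a literal `--` through to the script untouched. A small sketch of that split, independent of the addon's parser (the file path and argv values are made up for illustration):

```python
import argparse


def script_args(argv: list) -> list:
    """Return only the arguments following the '--' separator."""
    try:
        idx = argv.index("--") + 1
    except ValueError:
        return []  # no separator: Blender passed nothing to the script
    return argv[idx:]


# The slice can then be handed to argparse, as the startup script does:
parser = argparse.ArgumentParser()
parser.add_argument("--poliigon_cat_file")
args = parser.parse_args(
    script_args(["blender", "-b", "--", "--poliigon_cat_file", "/tmp/cat.txt"]))
```

Splitting before parsing matters because Blender's own flags (`-b`, `-P`, etc.) would otherwise be rejected by the script's parser.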
@@ -0,0 +1,112 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

from dataclasses import asdict, dataclass
from enum import IntEnum
import hashlib
import json
import sys
from typing import Dict, Optional


# FILE_H2C belongs to proc (proc.stdin)
FILE_C2H = sys.stderr


CMD_MARKER_START = "POLIIGON_CMD_START\n"
CMD_MARKER_END = "POLIIGON_CMD_END\n"


class SyncCmd(IntEnum):
    """Command codes"""

    HELLO = 0  # Client -> Host
    HELLO_OK = 1  # Host -> Client (ack)
    ASSET = 2  # Host -> Client
    ASSET_OK = 3  # Client -> Host (ack)
    ASSET_ERROR = 4  # Client -> Host (ack)
    EXIT = 5  # Host -> Client
    EXIT_ACK = 6  # Client -> Host (ack)
    CMD_DONE = 7  # internal
    CMD_ERROR = 8  # both directions (ack)
    STILL_THERE = 9  # Host -> Client


@dataclass
class SyncAssetBrowserCmd():
    """Command to be transmitted between host and client"""

    code: SyncCmd
    data: Optional[Dict] = None
    params: Optional[Dict] = None
    checksum: Optional[str] = ""

    @classmethod
    def from_json(cls, buf: str):
        """Alternate constructor, used after receiving a command."""

        cmd_dict = json.loads(buf)
        if "code" not in cmd_dict:
            raise KeyError("code")
        new = cls(**cmd_dict)
        new.code = SyncCmd(new.code)

        cmd_is_ok = new.check_checksum()
        if not cmd_is_ok:
            raise RuntimeError("Checksum error")
        return new

    def check_checksum(self) -> bool:
        checksum_to_test = self.checksum
        self.checksum = ""
        cmd_dict = asdict(self)
        json_str = json.dumps(cmd_dict, indent=4, default=vars) + "\n"
        checksum_calculated = hashlib.md5(json_str.encode("utf-8")).hexdigest()
        return checksum_to_test == checksum_calculated

    def to_json(self) -> str:
        cmd_dict = asdict(self)
        json_str = json.dumps(cmd_dict, indent=4, default=vars) + "\n"
        return json_str

    def calc_checksum(self) -> None:
        self.checksum = ""
        json_str = self.to_json()
        try:
            self.checksum = hashlib.md5(json_str.encode("utf-8")).hexdigest()
        except Exception as e:
            print(f"MD5 error: {e}")

    def prepare_send(self) -> str:
        self.calc_checksum()
        json_str = CMD_MARKER_START
        json_str += self.to_json()
        json_str += CMD_MARKER_END
        return json_str

    def send_to_process(self, proc) -> None:  # use on host
        try:
            proc.stdin.write(self.prepare_send())
            proc.stdin.flush()
        except Exception:
            # Deliberately silencing exceptions here.
            # Any exceptions regarding unexpectedly closed handles are handled
            # in respective threads instead.
            pass

    def send_to_stdio(self, file=FILE_C2H) -> None:  # use on client
        file.write(self.prepare_send())
        file.flush()
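The wire format above is simple: JSON framed by start/end markers, with an md5 checksum computed over the serialized command while its `checksum` field is empty. A standalone sketch of that round trip, with the class collapsed into two plain functions so it runs outside Blender (field names and marker strings mirror the definitions above):

```python
import hashlib
import json

CMD_MARKER_START = "POLIIGON_CMD_START\n"
CMD_MARKER_END = "POLIIGON_CMD_END\n"


def frame_cmd(code: int, data=None) -> str:
    """Serialize a command dict with its md5 checksum, wrapped in markers."""
    cmd = {"code": code, "data": data, "params": None, "checksum": ""}
    # Checksum is computed with the checksum field still empty.
    payload = json.dumps(cmd, indent=4) + "\n"
    cmd["checksum"] = hashlib.md5(payload.encode("utf-8")).hexdigest()
    return CMD_MARKER_START + json.dumps(cmd, indent=4) + "\n" + CMD_MARKER_END


def parse_cmd(framed: str) -> dict:
    """Extract the JSON between the markers and verify its checksum."""
    start = framed.index(CMD_MARKER_START) + len(CMD_MARKER_START)
    end = framed.index(CMD_MARKER_END)
    cmd = json.loads(framed[start:end])
    received = cmd["checksum"]
    # Re-serialize with an emptied checksum field, exactly as the sender did.
    cmd["checksum"] = ""
    payload = json.dumps(cmd, indent=4) + "\n"
    if received != hashlib.md5(payload.encode("utf-8")).hexdigest():
        raise RuntimeError("Checksum error")
    return cmd
```

This relies on `json.dumps` preserving dict insertion order, so sender and receiver serialize the fields identically; md5 serves here only as an integrity check over a local pipe, not as a security measure.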
@@ -0,0 +1,273 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

from typing import Optional

import bpy

from ..modules.poliigon_core.multilingual import _t
from .asset_browser import (
    get_num_selected_assets,
    is_asset_browser,
    is_only_poliigon_selected,
    is_poliigon_library,
    t_status_bar_update)
from ..constants import ASSET_ID_ALL
from ..toolbox import get_context
from .. import reporting


def build_asset_browser_progress(ui: bpy.types.Panel,
                                 context: bpy.context,
                                 layout: Optional[bpy.types.UILayout] = None,
                                 show_label: bool = True,
                                 show_cancel: bool = True,
                                 show_second_line: bool = False) -> None:

    if layout is None:
        layout = ui.layout

    if cTB.num_asset_browser_jobs == 0 and not cTB.blender_client_starting:
        return
    elif cTB.blender_client_starting:
        col = layout.column()
        row = col.row(align=True)
        row.label(text=_t("Blender client starting up..."))
        return

    num_jobs_done = cTB.num_jobs_error + cTB.num_jobs_ok
    progress = num_jobs_done / cTB.num_asset_browser_jobs
    progress = max(0.01, progress)
    done = num_jobs_done == cTB.num_asset_browser_jobs

    layout.separator()

    col = layout.column()
    row = col.row(align=True)

    if show_label and done:
        row.label(text=_t("Asset Browser Synchronization : "))
    elif show_label and not done:
        row.label(text=_t("Asset Browser Synchronization : {0:.1%}").format(
            progress))

    if done:
        if cTB.num_jobs_error:
            text = _t("Finished {0} assets, {1} errors").format(
                cTB.num_asset_browser_jobs, cTB.num_jobs_error)
            row.label(text=text,
                      icon="ERROR")
        else:
            text = _t("Successfully finished {0} assets").format(
                cTB.num_asset_browser_jobs)
            row.label(text=text,
                      icon="CHECKMARK")
        return

    tooltip_progress = _t("Processing assets: {0} of {1} assets done").format(
        num_jobs_done, cTB.num_asset_browser_jobs)

    split = row.split(factor=progress, align=True)
    op = split.operator(
        "poliigon.poliigon_setting", text="", emboss=True, depress=True
    )
    op.mode = "none"
    op.tooltip = tooltip_progress

    op = split.operator(
        "poliigon.poliigon_setting", text="", emboss=True, depress=False
    )
    op.mode = "none"
    op.tooltip = tooltip_progress

    if show_cancel:
        op = row.operator(
            "poliigon.cancel_asset_browser",
            text="",
            emboss=True,
            depress=False,
            icon="X"
        )
        if cTB.asset_browser_jobs_cancelled:
            row.enabled = False

    if show_second_line:
        text = _t("Processed {0} of {1} assets.").format(
            num_jobs_done, cTB.num_asset_browser_jobs)
        col.label(text=text)


class POLIIGON_PT_sidebar_left(bpy.types.Panel):
    bl_label = "Poliigon"
    bl_space_type = "FILE_BROWSER"
    bl_region_type = "TOOLS"
    bl_options = {"HIDE_HEADER"}

    view_screen_tracked = False

    @classmethod
    def poll(cls, context):
        if not cTB.is_logged_in():
            return False
        if not is_asset_browser(context):
            return False
        if not is_poliigon_library(context, incl_all_libs=False):
            return False
        return True

    @reporting.handle_draw()
    def draw(self, context):
        cTB._api._mp_relevant = True

        if not self.view_screen_tracked:
            # TODO(patrick): value not retained, re-triggering on future draws
            self.view_screen_tracked = True
            cTB.track_screen("blend_browser_lib")

        layout = self.layout
        box = layout.box()
        col = box.column(align=True)

        name_is_set = cTB.prefs.asset_browser_library_name != ""
        directory_is_set = cTB.get_library_path(primary=True) not in [None, ""]
        sync_options_enabled = name_is_set and directory_is_set

        if cTB.num_asset_browser_jobs == 0 and not cTB.lock_client_start.locked():
            col.label(text="Poliigon assets:")
            row_manual_sync = col.row(align=True)
            op_manual_sync = row_manual_sync.operator(
                "poliigon.update_asset_browser",
                text=_t("Synchronize Local Assets"),
                emboss=True,
                icon="FILE_REFRESH",
            )
            op_manual_sync.asset_id = ASSET_ID_ALL
            row_manual_sync.enabled = sync_options_enabled
        else:
            col.label(text=_t("Poliigon Asset Browser Synchronization"))
            build_asset_browser_progress(self,
                                         context,
                                         col,
                                         show_label=False,
                                         show_second_line=True)


class POLIIGON_PT_sidebar_right(bpy.types.Panel):
    bl_label = _t("Poliigon in Asset Browser")
    bl_space_type = "FILE_BROWSER"
    bl_region_type = "TOOL_PROPS"  # right side panel
    bl_options = {"HEADER_LAYOUT_EXPAND"}

    view_screen_tracked = False

    @classmethod
    def poll(cls, context):
        if not cTB.is_logged_in():
            return False
        if not is_asset_browser(context):
            return False
        if not is_poliigon_library(context):
            return False
        if not is_only_poliigon_selected(context):
            return False
        return True

    @reporting.handle_draw()
    def draw(self, context):
        if not is_poliigon_library(context):
            return

        if not self.view_screen_tracked:
            cTB.track_screen("blend_browser_import")
            self.view_screen_tracked = True

        num_selected = get_num_selected_assets(context)
        if num_selected == 1:
            label_import = _t("Import Asset (TODO)")
            label_reprocess = _t("Re-process Asset")
        elif num_selected > 1:
            label_import = _t("Import {0} Assets (TODO)").format(num_selected)
            label_reprocess = _t("Re-process {0} Assets").format(num_selected)
        else:
            label_import = _t("No Asset Selected")
            label_reprocess = _t("Re-process Asset")

        layout = self.layout
        col = layout.column()
        row = col.row()
        row.operator("poliigon.asset_browser_reprocess",
                     text=label_reprocess,
                     icon="FILE_REFRESH")

        col.separator()

        if not (cTB._env.env_name and "dev" in cTB._env.env_name.lower()):
            return

        # TODO(Andreas): Import button currently in dev environment only
        row = col.row(align=True)
        row.operator(
            "poliigon.asset_browser_import",
            text=label_import,
            emboss=True,
        )
        row.operator(
            "poliigon.asset_browser_quick_menu",
            text="",
            icon="TRIA_DOWN",
        )


classes_prod = (
    POLIIGON_PT_sidebar_left,
)

classes_dev = (
    POLIIGON_PT_sidebar_left,
    POLIIGON_PT_sidebar_right
)


classes = None
cTB = None


def register(addon_version: str):
    global classes
    global cTB

    cTB = get_context(addon_version)

    if cTB._env.env_name and "dev" in cTB._env.env_name.lower():
        classes = classes_dev
    else:
        classes = classes_prod

    for cls in classes:
        bpy.utils.register_class(cls)

    bpy.types.STATUSBAR_HT_header.prepend(build_asset_browser_progress)


def unregister():
    if bpy.app.timers.is_registered(t_status_bar_update):
        bpy.app.timers.unregister(t_status_bar_update)

    bpy.types.STATUSBAR_HT_header.remove(build_asset_browser_progress)

    for cls in reversed(classes):
        bpy.utils.unregister_class(cls)
@@ -0,0 +1,33 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

BUILD_OPTION_BOB = True
BUILD_OPTION_P4B = not BUILD_OPTION_BOB
ADDON_NAME = "polydex-blender"

NAME_PANEL = "Polydex"
NAME_PANEL_CATEGORY = "Polydex"
NAME_PREFS = "Polydex"
PREFIX_MT = "BOB"  # only in source
PREFIX_OP = "bob"  # only in source
PREFIX_PT = "BOB"  # only in source
PREFIX_ICON = "BOB_"  # only in source
PREFIX_FILENAME_SETTINGS = "polydex_"
PREFIX_FILENAME_BL_SETTINGS = "Polydex_"

URL_SENTRY = "https://060645be7fb64867bf9b8124b8e9a0f0@sentry.poliigon.com/12"
@@ -0,0 +1,67 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####


ADDON_NAME = "poliigon-addon-blender"

SUPPORTED_CONVENTION = 1

URLS_BLENDER = {
    "survey": "https://www.surveymonkey.com/r/p4b-addon-ui-01",
    "survey_subscribed": "https://www.surveymonkey.com/r/p4b-addon-ui-02",
    "survey_free": "https://www.surveymonkey.com/r/p4b-addon-ui-03",
    "p4b": "https://poliigon.com/blender",
    "changelog": "https://poliigon.com/blender",
    "unlimited": "https://help.poliigon.com/en/articles/10003983-unlimited-plans",
    "terms_policy": "https://help.poliigon.com/en/articles/10567243-terms-policy-documents"
}

ICONS = [  # tuples: (name, filename, type)
    ("ICON_poliigon", "poliigon_logo.png", "IMAGE"),
    ("ICON_asset_balance", "asset_balance.png", "IMAGE"),
    ("ICON_myassets", "my_assets.png", "IMAGE"),
    ("GET_preview", "get_preview.png", "IMAGE"),
    ("NO_preview", "icon_nopreview.png", "IMAGE"),
    ("ICON_dots", "icon_dots.png", "IMAGE"),
    ("ICON_acquired_check", "acquired_checkmark.png", "IMAGE"),
    ("ICON_plan_upgrade_check", "icon_plan_upgrade_check.png", "IMAGE"),
    ("ICON_plan_upgrade_info", "icon_plan_upgrade_info.png", "IMAGE"),
    ("ICON_plan_upgrade_unlimited", "icon_plan_upgrade_unlimited.png", "IMAGE"),
    ("ICON_subscription_paused", "subscription_paused.png", "IMAGE"),
    ("ICON_unlimited_local", "icon_unlimited_local.png", "IMAGE"),
    ("LOGO_unlimited", "logo_unlimited.png", "IMAGE"),
]

# TODO(Andreas): Not quite sure why we do not need these in addon-core,
#                nor why these are different from SIZES
HDRI_RESOLUTIONS = ["1K", "2K", "3K", "4K", "6K", "8K", "16K"]


# Default asset ID 1000000 means fetch all IDs.
# 1000000 to be expected outside of valid range.
# Did not use -1 to leave the negative ID range for side imports.
ASSET_ID_ALL = 1000000

# -6 on label width:
# Needed to reduce a bit to avoid truncation on OSX 1x and 2x screens.
POPUP_WIDTH = 250
POPUP_WIDTH_LABEL = POPUP_WIDTH - 6
POPUP_WIDTH_NARROW = 200
POPUP_WIDTH_LABEL_NARROW = POPUP_WIDTH_NARROW - 6
POPUP_WIDTH_WIDE = 400
POPUP_WIDTH_LABEL_WIDE = POPUP_WIDTH_WIDE - 6
@@ -0,0 +1,104 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

from ..modules.poliigon_core.multilingual import _t
from .utils_dlg import (
    get_ui_scale,
    wrapped_label)


def _build_section_free_user(cTB) -> None:
    w_label = cTB.width_draw_ui - 20 * get_ui_scale(cTB)

    box_free = cTB.vBase.box()
    col = box_free.column()

    msg = _t("Access 3,000+ studio quality assets")
    wrapped_label(cTB, w_label, msg, col, add_padding=False)
    msg = _t("Unused asset balance rolls over each month")
    wrapped_label(cTB, w_label, msg, col, add_padding=False, icon="CHECKMARK")
    msg = _t("Commercial & personal use license")
    wrapped_label(cTB, w_label, msg, col, add_padding=False, icon="CHECKMARK")
    msg = _t("Redownload even if your subscription ends")
    wrapped_label(cTB, w_label, msg, col, add_padding=False, icon="CHECKMARK")
    msg = _t("Cancel or pause at any time in a few clicks")
    wrapped_label(cTB, w_label, msg, col, add_padding=False, icon="CHECKMARK")
    msg = _t("50% discount for students and teachers")
    wrapped_label(cTB, w_label, msg, col, add_padding=False, icon="CHECKMARK")

    op = col.operator("poliigon.poliigon_link", text=_t("View Pricing"))
    op.mode = "subscribe"
    # TODO(Andreas): Figma did not contain any tooltips...
    op.tooltip = _t("View Poliigon Pricing Online")


def _build_section_paid_plan(cTB) -> None:
    w_label = cTB.width_draw_ui - 20 * get_ui_scale(cTB)

    box_free = cTB.vBase.box()
    col = box_free.column()

    name_plan = cTB.user.plan.plan_name
    wrapped_label(cTB, w_label, name_plan, col, add_padding=False)

    if not cTB.is_unlimited_user():
        credits = cTB.user.plan.plan_credit
        msg = _t("Assets per month: {0}").format(credits)
        wrapped_label(cTB, w_label, msg, col, add_padding=False)

        next_renew = cTB.user.plan.next_subscription_renewal_date
        msg = _t("Renewal Date: {0}").format(next_renew)
        wrapped_label(cTB, w_label, msg, col, add_padding=False)

    is_paused = cTB.is_paused_subscription()
    status = _t("Paused") if is_paused else _t("Active")
    msg = _t("Status: {0}").format(status)
    wrapped_label(cTB, w_label, msg, col, add_padding=False)

    op = col.operator("poliigon.poliigon_link", text=_t("View Details"))
    op.mode = "credits"
    # TODO(Andreas): Figma did not contain any tooltips...
    op.tooltip = _t("View Details of Your Plan Online")


def _build_still_loading(cTB) -> None:
    box_free = cTB.vBase.box()
    col = box_free.column()
    w_label = cTB.width_draw_ui - 20 * get_ui_scale(cTB)
    wrapped_label(
        cTB, w_label, "Fetching user data...", col, add_padding=False)


def build_user(cTB) -> None:
    cTB.logger_ui.debug("build_user")

    cTB.vBase.label(text=_t("Your Plan"))

    if cTB.fetching_user_data:
        _build_still_loading(cTB)
        return

    if cTB.is_free_user() or cTB.user.plan.plan_name is None:
        _build_section_free_user(cTB)
    else:
        _build_section_paid_plan(cTB)

    cTB.vBase.separator()
    op = cTB.vBase.operator("poliigon.poliigon_user", text=_t("Log Out"))
    op.mode = "logout"
    op.tooltip = _t("Log Out of Poliigon")
@@ -0,0 +1,47 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

import bpy

from ..modules.poliigon_core.multilingual import _t


def append_poliigon_groups_node_add(self, context) -> None:
    """Appending to add node menu, for Poliigon node groups"""

    self.layout.menu('POLIIGON_MT_add_node_groups')


class POLIIGON_MT_add_node_groups(bpy.types.Menu):
    """Menu for the Poliigon Shader node groups"""

    bl_space_type = 'NODE_EDITOR'
    bl_label = _t("Poliigon Node Groups")

    def draw(self, context):
        layout = self.layout
        col = layout.column(align=True)
        if bpy.app.version >= (2, 90):
            col.operator("poliigon.add_converter_node",
                         text=_t("Mosaic")
                         ).node_type = "Mosaic_UV_Mapping"
        col.operator("poliigon.add_converter_node",
                     text=_t("PBR mixer")
                     ).node_type = "Poliigon_Mixer"

        col.separator()
@@ -0,0 +1,124 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

from ..modules.poliigon_core.api_remote_control_params import (
    CATEGORY_ALL,
    KEY_TAB_ONLINE)
from ..modules.poliigon_core.assets import (
    AssetType,
    ASSET_TYPE_TO_CATEGORY_NAME)
from ..modules.poliigon_core.multilingual import _t
from .utils_dlg import get_ui_scale


# TODO(Andreas): This will be an exciting module in terms of multilingual
# TODO(Andreas): Would like to refactor this module


# @timer
def build_categories(cTB):
    cTB.logger_ui.debug("build_categories")

    categories_selected = []
    categories = []
    subcategories = []
    if cTB.vAssetType != CATEGORY_ALL:
        for _asset_type in cTB.vCategories["poliigon"].keys():
            if cTB.vAssetType in [CATEGORY_ALL, _asset_type]:
                categories += cTB.vCategories["poliigon"][_asset_type].keys()
        categories = sorted(list(set(categories)))

        if len(categories) > 0:
            category = ""
            categories_selected = []
            for _idx_sel in range(1, len(cTB.vActiveCat)):
                category += "/" + cTB.vActiveCat[_idx_sel]
                categories_selected.append(category)

            subcategories = [
                _cat.split("/")[-1]
                for _cat in categories
                if _cat.startswith(category) and _cat != category
            ]
            if len(subcategories) > 0:
                categories_selected.append("sub")

    col_categories = cTB.vBase.column()

    width_factor = len(categories_selected) + 1
    if cTB.width_draw_ui >= max(width_factor, 2) * 160 * get_ui_scale(cTB):
        row_categories = col_categories.row()
    else:
        row_categories = col_categories

    row_sub_cat = row_categories.row(align=True)

    type_hdri = ASSET_TYPE_TO_CATEGORY_NAME[AssetType.HDRI]
    type_model = ASSET_TYPE_TO_CATEGORY_NAME[AssetType.MODEL]
    type_tex = ASSET_TYPE_TO_CATEGORY_NAME[AssetType.TEXTURE]
    list_types = [CATEGORY_ALL, type_tex, type_model, type_hdri]

    area = cTB.settings["area"]
    if cTB.search_free and area == KEY_TAB_ONLINE:
        lbl_button_cat = _t("Free")
    elif cTB.vAssetType == CATEGORY_ALL:
        lbl_button_cat = _t("Select Category")
    else:
        lbl_button_cat = cTB.vAssetType
    op = row_sub_cat.operator(
        "poliigon.poliigon_category", text=lbl_button_cat, icon="TRIA_DOWN"
    )
    op.data = "0@" + "@".join(list_types)

    if len(categories_selected) == 0:
        col_categories.separator()
        return

    for _idx_sel, _cat_sel in enumerate(categories_selected):
        row_sub_cat = row_categories.row(align=True)

        if _idx_sel == 0:
            selected_categories = [
                _cat.split("/")[-1]
                for _cat in categories
                if len(_cat.split("/")) == 2
            ]
        elif _cat_sel == "sub":
            selected_categories = subcategories
        else:
            cat_parent = "/".join(_cat_sel.split("/")[:-1])
            selected_categories = [
                _cat.split("/")[-1]
                for _cat in categories
                if _cat.startswith(cat_parent) and _cat != cat_parent
            ]

        selected_categories = sorted(list(set(selected_categories)))

        lbl_button = _cat_sel.split("/")[-1]
        if _cat_sel == "sub":
            lbl_button = "All " + cTB.vActiveCat[-1]

        selected_categories.insert(0, "All " + cTB.vActiveCat[_idx_sel])
        data_op = f"{_idx_sel + 1}@{'@'.join(selected_categories)}"
        op = row_sub_cat.operator(
            "poliigon.poliigon_category", text=lbl_button, icon="TRIA_DOWN"
        )
        op.data = data_op

    col_categories.separator()
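Aside: the `op.data` strings assembled above pack a menu level plus its options into one `@`-separated payload (e.g. `"1@All Assets@Brick@Wood"`), which the `poliigon.poliigon_category` operator presumably splits back apart. A hypothetical standalone decoder, just to make the convention concrete (the function name and example values are illustrative, not part of the addon):

```python
# Hypothetical decoder for the "level@option1@option2@..." payload that
# build_categories stores in op.data. Not the addon's actual receiver;
# it only mirrors the packing convention visible in the code above.

def decode_category_data(data: str):
    level, *options = data.split("@")
    return int(level), options

level, options = decode_category_data("1@All Assets@Brick@Wood")
# level == 1, options == ["All Assets", "Brick", "Wood"]
```

Note the scheme assumes no category name contains `@`, which holds for Poliigon's category labels.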
@@ -0,0 +1,282 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

import bpy

from ..modules.poliigon_core.multilingual import _t
from ..modules.poliigon_core.notifications import ActionType
from ..constants import URLS_BLENDER
from .utils_dlg import (
    get_ui_scale,
    wrapped_label)
# from .. import reporting


def build_mode(url, action, id_notice):
    return "notify@{}@{}@{}".format(url, action, id_notice)
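For reference, `build_mode` above simply packs its three arguments into an `@`-separated string with a fixed `notify` prefix, which the `poliigon.poliigon_link` operator receives via its `mode` property. A plain-Python sketch, no Blender required (the example values are made up for illustration):

```python
# Mirror of build_mode above; the "notify" token and "@" separators
# are from the source, the example arguments are invented.

def build_mode(url, action, id_notice):
    return "notify@{}@{}@{}".format(url, action, id_notice)

mode = build_mode("https://example.com/survey", "Take survey", "NOTICE_42")
# mode == "notify@https://example.com/survey@Take survey@NOTICE_42"
```

Since URLs may themselves contain `@`, a receiver of this string would need to split from the left a bounded number of times rather than on every `@`.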

def _draw_notification_open_url_single_row(cTB, notice, first_row, icon) -> None:
    # Single row with text + button.
    # TODO: generalize this for notification message and length,
    #       and if dismiss is included.
    # During SOFT-780 this has been changed for POPUP_MESSAGE in a
    # very simplistic way
    # (commit: https://github.com/poliigon/poliigon-addon-blender/pull/278/commits/00296ab70288893a023a6705d52eb4505ce36897).
    # When addressing this properly,
    # make sure to address it for all notification types.
    first_row.alert = True
    first_row.label(text=notice.title)
    first_row.alert = False
    op = first_row.operator(
        "poliigon.poliigon_link",
        icon=icon,
        text=notice.label,
    )
    if notice.tooltip != "":
        op.tooltip = notice.tooltip
    op.mode = build_mode(
        notice.url,
        notice.label,
        notice.id_notice)


def _draw_notification_open_url_two_rows(
        cTB, notice, first_row, main_col, icon) -> None:
    # Two rows (or more, if text wrapping).
    col = first_row.column(align=True)
    col.alert = True
    # Empirically found squaring worked best for 1 & 2x displays,
    # which accounts for the box+panel padding and the 'x' button.
    if notice.allow_dismiss:
        padding_width = 32 * get_ui_scale(cTB)
    else:
        padding_width = 17 * get_ui_scale(cTB)
    wrapped_label(
        cTB, cTB.width_draw_ui - padding_width, notice.title, col)
    col.alert = False

    second_row = main_col.row(align=True)
    second_row.scale_y = 1.0
    op = second_row.operator(
        "poliigon.poliigon_link",
        icon=icon,
        text=notice.label,
    )
    if notice.tooltip != "":
        op.tooltip = notice.tooltip
    op.mode = build_mode(
        notice.url,
        notice.label,
        notice.id_notice)


def _draw_notification_open_url(
        cTB, notice, first_row, main_col, panel_width, icon) -> None:
    # Empirical width for "Beta addon: [Take survey]" specifically.
    single_row_width = 250
    if panel_width > single_row_width:
        _draw_notification_open_url_single_row(cTB, notice, first_row, icon)
    else:
        _draw_notification_open_url_two_rows(
            cTB, notice, first_row, main_col, icon)


def _draw_notification_update_ready_single_row(cTB, notice, first_row, icon) -> None:
    # Single row with text + button.
    first_row.alert = True
    first_row.label(text=notice.title)
    first_row.alert = False
    splitrow = first_row.split(factor=0.7, align=True)
    splitcol = splitrow.split(align=True)

    label = notice.label
    if label == "":
        label = notice.title

    op = splitcol.operator(
        "poliigon.poliigon_link",
        icon=icon,
        text=label,
    )
    if notice.tooltip != "":
        op.tooltip = notice.tooltip
    op.mode = build_mode(
        notice.download_url, notice.label, notice.id_notice)

    splitcol = splitrow.split(align=True)
    op = splitcol.operator(
        "poliigon.poliigon_link",
        text="Logs",
    )
    # if notice.tooltip is not None:
    op.tooltip = _t("See changes in this version")
    op.mode = build_mode(
        URLS_BLENDER["changelog"], "Logs", notice.id_notice)


def _draw_notification_update_ready_two_rows(
        cTB, notice, first_row, main_col, icon) -> None:
    # Two rows (or more, if text wrapping).
    col = first_row.column(align=True)
    col.alert = True
    if notice.allow_dismiss:
        padding_width = 32 * get_ui_scale(cTB)
    else:
        padding_width = 17 * get_ui_scale(cTB)
    wrapped_label(
        cTB, cTB.width_draw_ui - padding_width, notice.title, col)
    col.alert = False

    label = notice.label
    if label == "":
        label = notice.title

    second_row = main_col.row(align=True)
    splitrow = second_row.split(factor=0.7, align=True)
    splitcol = splitrow.split(align=True)
    op = splitcol.operator(
        "poliigon.poliigon_link",
        icon=icon,
        text=label,
    )
    if notice.tooltip != "":
        op.tooltip = notice.tooltip
    op.mode = build_mode(
        notice.download_url, notice.label, notice.id_notice)
    splitcol = splitrow.split(align=True)
    op = splitcol.operator(
        "poliigon.poliigon_link",
        text="Logs",
    )
    op.tooltip = _t("See changes in this version")
    op.mode = build_mode(
        URLS_BLENDER["changelog"], "Logs", notice.id_notice)


def _draw_notification_update_ready(
        cTB, notice, first_row, main_col, panel_width, icon) -> None:
    # Empirical width for "Update ready: Download | logs".
    single_row_width = 300
    if panel_width > single_row_width:
        _draw_notification_update_ready_single_row(
            cTB, notice, first_row, icon)
    else:
        _draw_notification_update_ready_two_rows(
            cTB, notice, first_row, main_col, icon)


def _draw_notification_popup_message_two_rows(
        cTB, notice, first_row, main_col, icon) -> bpy.types.Operator:
    # Two rows (or more, if text wrapping).
    col = first_row.column(align=True)
    col.alert = notice.alert
    # Empirically found squaring worked best for 1 & 2x displays,
    # which accounts for the box+panel padding and the 'x' button.
    if notice.allow_dismiss:
        padding_width = 32 * get_ui_scale(cTB)
    else:
        padding_width = 17 * get_ui_scale(cTB)
    wrapped_label(
        cTB, cTB.width_draw_ui - padding_width, notice.title, col)
    col.alert = False

    second_row = main_col.row(align=True)
    second_row.scale_y = 1.0
    op = second_row.operator(
        "poliigon.popup_message",
        icon=icon,
        text="View",
    )
    return op


def _draw_notification_popup_message(
        cTB, notice, first_row, main_col, panel_width, icon) -> None:
    op = _draw_notification_popup_message_two_rows(
        cTB, notice, first_row, main_col, icon)

    op.message_body = notice.body
    op.notice_id = notice.id_notice
    if notice.tooltip != "":
        op.tooltip = notice.tooltip
    if notice.url != "":
        op.message_url = notice.url


def _draw_notification_run_operator(cTB, notice, first_row, icon) -> None:
    # Single row with only a button.
    op = first_row.operator(
        "poliigon.notice_operator",
        text=notice.title,
        icon=icon,
    )
    op.notice_id = notice.id_notice
    op.ops_name = notice.ops_name
    op.tooltip = notice.tooltip


# TODO(Andreas): deactivated reporting here, as I needed a third parameter and
#                was not able to quickly make handle_draw() work
# @reporting.handle_draw()
def notification_banner(cTB, layout):
    """General purpose notification banner UI draw element."""

    notice = cTB.notify.get_top_notice()

    if notice is None:
        return

    box = layout.box()
    row = box.row(align=True)
    main_col = row.column(align=True)

    scale = max(get_ui_scale(cTB), 1)
    panel_width = cTB.width_draw_ui / scale

    first_row = main_col.row(align=False)
    x_row = first_row  # x_row is the row to add the x button to, if there.

    # Only purpose is to trigger view signal (only once)
    cTB.notify.notification_popup(notice, do_signal_view=True)

    icon = notice.icon
    if icon is None:
        icon = "NONE"

    if notice.action == ActionType.OPEN_URL:
        _draw_notification_open_url(
            cTB, notice, first_row, main_col, panel_width, icon)
    elif notice.action == ActionType.UPDATE_READY:
        _draw_notification_update_ready(
            cTB, notice, first_row, main_col, panel_width, icon)
    elif notice.action == ActionType.POPUP_MESSAGE:
        _draw_notification_popup_message(
            cTB, notice, first_row, main_col, panel_width, icon)
    elif notice.action == ActionType.RUN_OPERATOR:
        _draw_notification_run_operator(cTB, notice, first_row, icon)
    else:
        main_col.label(text=notice.title)
        cTB.logger_ui.error("Invalid notification type")

    if notice.allow_dismiss:
        right_col = x_row.column(align=True)
        right_col.operator(
            "poliigon.close_notification", icon="X", text="", emboss=False)

    layout.separator()
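The banner code above normalizes the drawn pixel width by the UI scale (`panel_width = cTB.width_draw_ui / max(get_ui_scale(cTB), 1)`) and compares it against empirical thresholds (250 for open-URL notices, 300 for update-ready notices) to choose a single-row or two-row layout. A minimal sketch of that decision as a pure function, runnable outside Blender (the function name and inputs are illustrative, not the addon's API):

```python
# Sketch of the responsive layout decision in notification_banner.
# Thresholds (250 / 300) come from the source; pick_layout itself
# is a made-up name for illustration.

def pick_layout(width_draw_ui: float, ui_scale: float,
                single_row_width: int) -> str:
    # Divide pixel width by UI scale, clamping the divisor to 1
    # so sub-1x scales never inflate the logical width.
    panel_width = width_draw_ui / max(ui_scale, 1)
    return "single_row" if panel_width > single_row_width else "two_rows"
```

For example, a 600 px panel at 2x scale yields a logical width of 300, which does not exceed the update-ready threshold of 300, so the two-row layout is used.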
@@ -0,0 +1,139 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

import bpy

from ..modules.poliigon_core.api_remote_control_params import (
    KEY_TAB_IMPORTED,
    KEY_TAB_MY_ASSETS,
    KEY_TAB_ONLINE)
from ..modules.poliigon_core.multilingual import _t


def _draw_unlimited_icon(cTB, *, row: bpy.types.UILayout) -> None:
    icon_value = cTB.ui_icons["LOGO_unlimited"].icon_id
    op_icon = row.operator(
        "poliigon.poliigon_setting", text="", emboss=True, icon_value=icon_value)
    op_icon.mode = "show_user"
    # TODO(Andreas): Tooltip???
    op_icon.tooltip = _t("Switch to your account details")


def _draw_asset_balance(cTB, *, row: bpy.types.UILayout) -> None:
    if cTB.is_unlimited_user():
        _draw_unlimited_icon(cTB, row=row)
        return

    # Asset balance
    credits = cTB.get_user_credits()
    balance_icon = cTB.ui_icons["ICON_asset_balance"].icon_id
    if cTB.is_paused_subscription() and credits <= 0:
        balance_icon = cTB.ui_icons["ICON_subscription_paused"].icon_id

    op_credits = row.operator(
        "poliigon.poliigon_setting",
        text=str(credits),
        icon_value=balance_icon  # TODO: use new asset icon
    )
    op_credits.tooltip = _t(
        "Your asset balance shows how many assets you can\n"
        "purchase. Free assets and downloading assets you\n"
        "already own doesn’t affect your balance")
    op_credits.mode = "show_user"


def _add_asset_tab(cTB,
                   row: bpy.types.UILayout,
                   *,
                   tab: str,
                   mode: str,
                   icon: str = "NONE",
                   icon_value: int = 0,
                   tooltip: str = ""
                   ) -> None:
    no_user = not cTB.settings["show_user"]
    no_settings = not cTB.settings["show_settings"]
    no_user_or_settings = no_user and no_settings

    col = row.column(align=True)
    is_tab_active = cTB.settings["area"] == tab
    op = col.operator(
        "poliigon.poliigon_setting",
        text="",
        icon=icon,
        icon_value=icon_value,
        depress=is_tab_active and no_user_or_settings,
    )
    op.mode = mode
    op.tooltip = tooltip


# @timer
def build_areas(cTB):
    cTB.logger_ui.debug("build_areas")
    cTB.initial_view_screen()

    row = cTB.vBase.row(align=True)
    row.scale_x = 1.1
    row.scale_y = 1.1

    _add_asset_tab(
        cTB,
        row,
        tab=KEY_TAB_ONLINE,
        mode="area_poliigon",
        icon="HOME",
        tooltip=_t("Show Poliigon Assets"))
    _add_asset_tab(
        cTB,
        row,
        tab=KEY_TAB_MY_ASSETS,
        mode="area_my_assets",
        icon_value=cTB.ui_icons["ICON_myassets"].icon_id,
        tooltip=_t("Show My Assets"))
    _add_asset_tab(
        cTB,
        row,
        tab=KEY_TAB_IMPORTED,
        mode="area_imported",
        icon="OUTLINER_OB_GROUP_INSTANCE",
        tooltip=_t("Show Imported Assets"))

    op = row.operator(
        "poliigon.poliigon_setting",
        text="",
        icon_value=cTB.ui_icons["ICON_poliigon"].icon_id,
        depress=cTB.settings["show_user"],
    )
    op.mode = "my_account"
    op.tooltip = _t("Show Your Account Details")

    row.separator()

    row_prefs = row.row(align=True)
    row_prefs.alignment = "RIGHT"

    _draw_asset_balance(cTB, row=row_prefs)

    _ = row_prefs.operator(
        "poliigon.open_preferences",
        text="",
        icon="PREFERENCES",
    ).set_focus = "all"

    cTB.vBase.separator()
@@ -0,0 +1,141 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

from ..modules.poliigon_core.multilingual import _t
from ..modules.poliigon_core.upgrade_content import UpgradeContent
from ..dialogs.utils_dlg import (
    get_ui_scale,
    wrapped_label)


def _draw_banner(cTB, upgrade_content: UpgradeContent) -> None:
    """Draws the actual banner and its buttons."""

    width = cTB.width_draw_ui - 42 * get_ui_scale(cTB)

    row = cTB.vBase.row(align=True)
    row.scale_x = 1.1
    row.scale_y = 1.1

    box = row.box()
    col = box.column()

    text = upgrade_content.banner_primary_text
    label = upgrade_content.banner_button_text
    key_icon = upgrade_content.icon_path

    wrapped_label(
        cTB, width=width, text=text, container=col)

    row_buttons = col.row(align=True)
    if upgrade_content.open_popup:
        op = row_buttons.operator(
            "poliigon.popup_change_plan",
            text=label,
            icon_value=cTB.ui_icons[key_icon].icon_id)
        op.tooltip = _t("By clicking here, we will change the subscription "
                        "plan as shown above")
        if upgrade_content.allow_dismiss:
            op = row_buttons.operator(
                "poliigon.popup_change_plan_dismiss",
                text="",
                icon="PANEL_CLOSE")
    else:
        op = row_buttons.operator(
            "poliigon.poliigon_link",
            text=label,
            icon_value=cTB.ui_icons[key_icon].icon_id)
        op.mode = "subscribe_banner"


def _draw_banner_in_progress(cTB, upgrade_content: UpgradeContent) -> None:
    """Draws an 'upgrade in progress' banner."""

    width = cTB.width_draw_ui - 42 * get_ui_scale(cTB)

    row = cTB.vBase.row(align=True)
    row.scale_x = 1.1
    row.scale_y = 1.1

    box = row.box()
    col = box.column()

    primary = upgrade_content.upgrading_primary_text
    secondary = upgrade_content.upgrading_secondary_text
    text = f"{primary}   {secondary}"  # three spaces are deliberate
    wrapped_label(cTB, width=width, text=text, container=col)


def _draw_banner_finished(cTB, upgrade_content: UpgradeContent) -> None:
    """Draws the final success/error banner."""

    width = cTB.width_draw_ui - 42 * get_ui_scale(cTB)

    row = cTB.vBase.row(align=True)
    row.scale_x = 1.1
    row.scale_y = 1.1

    box = row.box()
    col = box.column()

    if cTB.msg_plan_upgrade_finished is not None:
        text = cTB.msg_plan_upgrade_finished
    elif cTB.error_plan_upgrade is not None:
        head = upgrade_content.error_popup_title
        text = upgrade_content.error_popup_text.format(
            cTB.error_plan_upgrade)
        text = f"{head}: {text}"
    else:
        head = upgrade_content.success_popup_title
        text = upgrade_content.success_popup_text
        text = f"{head}: {text}"

    cTB.msg_plan_upgrade_finished = text
    wrapped_label(cTB, width=width, text=text, container=col)
    row.operator(
        "poliigon.banner_finish_dismiss",
        text="",
        icon="PANEL_CLOSE")


# @timer
def build_upgrade_banner(cTB) -> None:
    """Draws an 'upgrade subscription plan' banner, including a progress
    banner and a success/error banner.
    """

    cTB.logger_ui.debug("build_upgrade_paths")

    if cTB.user is None:
        return
    if cTB.upgrade_manager is None:
        return
    if cTB.upgrade_manager.content is None:
        return

    upgrade_content = cTB.upgrade_manager.content
    if cTB.plan_upgrade_finished:
        _draw_banner_finished(cTB, upgrade_content)
    elif cTB.plan_upgrade_in_progress:
        _draw_banner_in_progress(cTB, upgrade_content)
    elif cTB.upgrade_manager.check_show_banner():
        _draw_banner(cTB, upgrade_content)
    else:
        return

    cTB.vBase.separator()
File diff suppressed because it is too large
@@ -0,0 +1,88 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

from ..modules.poliigon_core.multilingual import _t
from .utils_dlg import (
    get_ui_scale,
    wrapped_label)


# @timer
def build_library(cTB):
    cTB.logger_ui.debug("build_library")
    factor_space = 1.0 / cTB.width_draw_ui

    wrapped_label(
        cTB,
        cTB.width_draw_ui,
        _t("Welcome to the Poliigon Addon!"),
        cTB.vBase
    )

    cTB.vBase.separator()

    wrapped_label(
        cTB,
        cTB.width_draw_ui,
        _t("Select where you will store Poliigon assets."),
        cTB.vBase
    )

    cTB.vBase.separator()

    box_row = cTB.vBase.box().row()
    box_row.separator(factor=factor_space)
    col = box_row.column()
    box_row.separator(factor=factor_space)

    col.label(text=_t("Library Location"))

    label_library = cTB.settings["set_library"]
    if label_library == "":
        label_library = _t("Select Location")

    op = col.operator(
        "poliigon.poliigon_library",
        icon="FILE_FOLDER",
        text=label_library,
    )
    op.mode = "set_library"
    op.directory = cTB.settings["set_library"]
    op.tooltip = _t("Select Location")

    col.separator()
    row_confirm = col.row()
    row_confirm.scale_y = 1.5

    op = row_confirm.operator(
        "poliigon.poliigon_setting", text=_t("Confirm"))
    op.mode = "set_library"
    op.tooltip = _t("Confirm Library location")

    col.separator()

    wrapped_label(
        cTB,
        cTB.width_draw_ui - 30 * get_ui_scale(cTB),
        _t("You can change this and add more directories in the settings "
           "at any time."),
        col
    )

    col.separator()
    cTB.vBase.separator()
@@ -0,0 +1,312 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

import bpy

from ..modules.poliigon_core.multilingual import _t
from .utils_dlg import (
    get_ui_scale,
    wrapped_label)
from ..toolbox import c_Toolbox


ERR_CREDS_FORMAT = _t("Invalid email format/password length.")
# TODO(Andreas): Currently not sure about this error
ERR_LOGIN_TIMEOUT = _t("Login with website timed out, please try again")


def _draw_welcome_or_error(cTB: c_Toolbox, layout: bpy.types.UILayout) -> None:
    if cTB.user_invalidated() and not cTB.login_in_progress:
        layout.separator()

        if cTB.last_login_error == ERR_LOGIN_TIMEOUT:
            wrapped_label(
                cTB,
                cTB.width_draw_ui,
                cTB.last_login_error,
                layout,
                icon="ERROR"
            )
        else:
            wrapped_label(
                cTB,
                cTB.width_draw_ui,
                _t("Warning: You have been logged out as this account was "
                   "signed in on another device."),
                layout,
                icon="ERROR"
            )

    else:
        wrapped_label(
            cTB,
            cTB.width_draw_ui,
            _t("Welcome to the Poliigon Addon!"),
            layout
        )

    layout.separator()


def _draw_share_addon_errors(cTB: c_Toolbox,
                             layout: bpy.types.UILayout,
                             enabled: bool = True) -> None:
    # Show terms of service, optin/out.
    row_opt = layout.row()
    row_opt.alignment = "LEFT"
    row_opt.enabled = enabled
    # __spec__.parent since __package__ got deprecated
    # Since this module moved into dialogs,
    # we need to split off .dialogs
    spec_parent = __spec__.parent
    spec_parent = spec_parent.split(".")[0]
    prefs = bpy.context.preferences.addons.get(spec_parent, None)
    row_opt.prop(prefs.preferences, "reporting_opt_in", text="")
    twidth = cTB.width_draw_ui - 42 * get_ui_scale(cTB)
    wrapped_label(cTB, twidth, _t("Share addon errors / usage"), row_opt)


def _draw_switch_email_login(col: bpy.types.UILayout,
                             enabled: bool = True) -> None:
    row_login_email = col.row()
    row_login_email.enabled = enabled
    op_login_email = row_login_email.operator("poliigon.poliigon_user",
                                              text=_t("Login via email"),
                                              emboss=False)
    op_login_email.mode = "login_switch_to_email"
    op_login_email.tooltip = _t("Login via email")


def _draw_browser_login(cTB: c_Toolbox, col: bpy.types.UILayout) -> None:
    if cTB.login_in_progress:
        _draw_share_addon_errors(cTB, col, enabled=False)

    row_buttons = col.row(align=True)
    row_buttons.scale_y = 1.25

    col1 = row_buttons.column(align=True)
    op_login_website = col1.operator("poliigon.poliigon_user",
                                     text=_t("Opening browser..."),
                                     depress=True)
    op_login_website.mode = "none"
    op_login_website.tooltip = _t("Complete login via opened webpage")
    col1.enabled = False

    col2 = row_buttons.column(align=True)
    op_login_cancel = col2.operator("poliigon.poliigon_user",
                                    text="",
                                    icon="X")
op_login_cancel.mode = "login_cancel"
|
||||
op_login_cancel.tooltip = _t("Cancel Log In")
|
||||
|
||||
col.separator()
|
||||
|
||||
_draw_switch_email_login(col, enabled=False)
|
||||
else:
|
||||
_draw_share_addon_errors(cTB, col)
|
||||
|
||||
row_button = col.row()
|
||||
row_button.scale_y = 1.25
|
||||
|
||||
op_login_website = row_button.operator("poliigon.poliigon_user",
|
||||
text=_t("Login via Browser"))
|
||||
op_login_website.mode = "login_with_website"
|
||||
op_login_website.tooltip = _t("Login via Browser")
|
||||
|
||||
col.separator()
|
||||
|
||||
_draw_switch_email_login(col)
|
||||
|
||||
|
||||
def _draw_email_login(cTB: c_Toolbox, col: bpy.types.UILayout) -> None:
|
||||
vProps = bpy.context.window_manager.poliigon_props
|
||||
|
||||
col.label(text="Email")
|
||||
|
||||
row = col.row(align=True)
|
||||
row.prop(vProps, "vEmail")
|
||||
|
||||
col_x = row.column(align=True)
|
||||
op = col_x.operator("poliigon.poliigon_setting",
|
||||
text="",
|
||||
icon="X")
|
||||
op.tooltip = _t("Clear Email")
|
||||
op.mode = "clear_email"
|
||||
|
||||
error_credentials = False
|
||||
has_login_error = cTB.last_login_error is not None
|
||||
error_login = has_login_error and cTB.last_login_error != ERR_LOGIN_TIMEOUT
|
||||
if error_login and "@" not in vProps.vEmail:
|
||||
error_credentials = True
|
||||
|
||||
col.separator()
|
||||
wrapped_label(
|
||||
cTB,
|
||||
cTB.width_draw_ui - 40 * get_ui_scale(cTB),
|
||||
_t("Email format is invalid e.g. john@example.org"),
|
||||
col,
|
||||
icon="ERROR")
|
||||
col.separator()
|
||||
|
||||
col.label(text=_t("Password"))
|
||||
|
||||
row = col.row(align=True)
|
||||
|
||||
if cTB.settings["show_pass"]:
|
||||
row.prop(vProps, "vPassShow")
|
||||
vPass = vProps.vPassShow
|
||||
|
||||
else:
|
||||
row.prop(vProps, "vPassHide")
|
||||
vPass = vProps.vPassHide
|
||||
|
||||
col_x = row.column(align=True)
|
||||
|
||||
op = col_x.operator("poliigon.poliigon_setting",
|
||||
text="",
|
||||
icon="X")
|
||||
op.tooltip = _t("Clear Password")
|
||||
op.mode = "clear_pass"
|
||||
|
||||
if error_login and len(vPass) < 6:
|
||||
error_credentials = True
|
||||
|
||||
col.separator()
|
||||
wrapped_label(
|
||||
cTB,
|
||||
cTB.width_draw_ui - 40 * get_ui_scale(cTB),
|
||||
_t("Password should be at least 6 characters."),
|
||||
col,
|
||||
icon="ERROR")
|
||||
col.separator()
|
||||
|
||||
_draw_share_addon_errors(cTB, col)
|
||||
|
||||
enable_login_button = len(vProps.vEmail) > 0 and len(vPass) > 0
|
||||
|
||||
row = col.row()
|
||||
row.scale_y = 1.25
|
||||
|
||||
if cTB.login_in_progress:
|
||||
op_login = row.operator("poliigon.poliigon_setting",
|
||||
text=_t("Logging In..."),
|
||||
depress=enable_login_button)
|
||||
op_login.mode = "none"
|
||||
op_login.tooltip = _t("Logging In...")
|
||||
row.enabled = False
|
||||
else:
|
||||
op_login = row.operator("poliigon.poliigon_user",
|
||||
text=_t("Login via email"))
|
||||
op_login.mode = "login"
|
||||
op_login.tooltip = _t("Login via email")
|
||||
|
||||
row.enabled = enable_login_button
|
||||
|
||||
if cTB.last_login_error == ERR_CREDS_FORMAT:
|
||||
# Will draw above with more specific messages if condition true, like
|
||||
# invalid email format or password length.
|
||||
pass
|
||||
elif error_login and not error_credentials:
|
||||
col.separator()
|
||||
|
||||
wrapped_label(
|
||||
cTB,
|
||||
cTB.width_draw_ui - 40 * get_ui_scale(cTB),
|
||||
cTB.last_login_error,
|
||||
col,
|
||||
icon="ERROR",
|
||||
)
|
||||
|
||||
col.separator()
|
||||
|
||||
op_forgot = col.operator("poliigon.poliigon_link",
|
||||
text=_t("Forgot Password?"),
|
||||
emboss=False)
|
||||
op_forgot.mode = "forgot"
|
||||
op_forgot.tooltip = _t("Reset your Poliigon password")
|
||||
|
||||
op_login_website = col.operator("poliigon.poliigon_user",
|
||||
text=_t("Login via Browser"),
|
||||
emboss=False)
|
||||
op_login_website.mode = "login_switch_to_browser"
|
||||
op_login_website.tooltip = _t("Login via Browser")
|
||||
|
||||
|
||||
def _draw_login(cTB, layout: bpy.types.UILayout) -> None:
|
||||
spc = 1.0 / cTB.width_draw_ui
|
||||
|
||||
box = layout.box()
|
||||
row = box.row()
|
||||
row.separator(factor=spc)
|
||||
col = row.column()
|
||||
row.separator(factor=spc)
|
||||
|
||||
twidth = cTB.width_draw_ui - 42 * get_ui_scale(cTB)
|
||||
wrapped_label(cTB, twidth, _t("Login"), col)
|
||||
col.separator()
|
||||
|
||||
if cTB.login_mode_browser:
|
||||
_draw_browser_login(cTB, col)
|
||||
|
||||
else:
|
||||
_draw_email_login(cTB, col)
|
||||
|
||||
|
||||
def _draw_signup(cTB, layout: bpy.types.UILayout) -> None:
|
||||
wrapped_label(
|
||||
cTB,
|
||||
cTB.width_draw_ui,
|
||||
_t("Don't have an account?"),
|
||||
layout,
|
||||
)
|
||||
op_signup = layout.operator("poliigon.poliigon_link",
|
||||
text=_t("Sign Up"))
|
||||
op_signup.mode = "signup"
|
||||
op_signup.tooltip = _t("Create a Poliigon account")
|
||||
|
||||
|
||||
def _draw_legal(layout: bpy.types.UILayout) -> None:
|
||||
row = layout.row()
|
||||
col = row.column(align=True)
|
||||
|
||||
op_terms = col.operator("poliigon.poliigon_link",
|
||||
text=_t("Terms & Conditions"),
|
||||
emboss=False)
|
||||
op_terms.tooltip = _t("View the terms and conditions page")
|
||||
op_terms.mode = "terms"
|
||||
|
||||
op_privacy = col.operator("poliigon.poliigon_link",
|
||||
text=_t("Privacy Policy"),
|
||||
emboss=False)
|
||||
op_privacy.tooltip = _t("View the Privacy Policy ")
|
||||
op_privacy.mode = "privacy"
|
||||
|
||||
|
||||
# @timer
|
||||
def build_login(cTB):
|
||||
cTB.logger_ui.debug("build_login")
|
||||
|
||||
if cTB.last_login_error is not None:
|
||||
cTB.login_in_progress = 0
|
||||
|
||||
_draw_welcome_or_error(cTB, cTB.vBase)
|
||||
_draw_login(cTB, cTB.vBase)
|
||||
cTB.vBase.separator()
|
||||
_draw_signup(cTB, cTB.vBase)
|
||||
cTB.vBase.separator()
|
||||
_draw_legal(cTB.vBase)
|
||||
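The inline checks in `_draw_email_login` above (email must contain an "@", password at least 6 characters, mirroring `ERR_CREDS_FORMAT`) can be condensed into one predicate. This is an illustrative sketch; `credentials_look_valid` is a hypothetical helper name, not part of the addon:

```python
def credentials_look_valid(email: str, password: str) -> bool:
    # Hypothetical helper mirroring the draw-time checks: an email without
    # "@" or a password shorter than 6 characters triggers the
    # "Invalid email format/password length." error label.
    return "@" in email and len(password) >= 6
```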
@@ -0,0 +1,88 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

from typing import List, Optional

import bpy

from ..modules.poliigon_core.multilingual import _t
from .utils_dlg import (
    get_ui_scale,
    wrapped_label)


def open_popup(cTB,
               title: str = "",
               msg: str = "",
               buttons: List[str] = [_t("OK")],
               commands: List[Optional[str]] = [None],
               mode: Optional[str] = None,
               w_limit: int = 0
               ) -> None:
    cTB.logger_ui.debug(f"open_popup mode={mode}, w_limit={w_limit}"
                        f" title={title}, msg={msg},\n"
                        f" buttons={buttons},\n"
                        f" commands={commands}")

    def draw(self, context):
        layout = self.layout

        col = layout.column(align=True)

        icon = "INFO"
        if mode == "question":
            icon = "QUESTION"
        elif mode == "error":
            icon = "ERROR"

        col.label(text=title, icon=icon)

        col.separator()

        if w_limit == 0:
            col.label(text=msg)
        else:
            wrapped_label(cTB, w_limit * get_ui_scale(cTB), msg, col)

        col.separator()
        col.separator()

        vRow = col.row()
        for idx_button in range(len(buttons)):
            if commands[idx_button] in [None, "cancel"]:
                op = vRow.operator(
                    "poliigon.poliigon_setting",
                    text=buttons[idx_button])
                op.mode = "none"
            elif commands[idx_button] == "credits":
                op = vRow.operator(
                    "poliigon.poliigon_link",
                    text=_t("Add Credits"),
                    depress=1)
                op.mode = "credits"
            elif commands[idx_button] == "open_p4b_url":
                op = vRow.operator(
                    "poliigon.poliigon_link",
                    text=buttons[idx_button],
                    depress=1)
                op.mode = "p4b"
            elif commands[idx_button] == "check_update":
                vRow.operator("poliigon.check_update",
                              text=buttons[idx_button])

    bpy.context.window_manager.popover(draw)
@@ -0,0 +1,376 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

import os
import re

import bpy

from ..modules.poliigon_core.api_remote_control_params import (
    CATEGORY_ALL,
    get_search_key,
    KEY_TAB_IMPORTED)
from ..modules.poliigon_core.assets import (
    AssetData,
    AssetType,
    ModelType)
from ..modules.poliigon_core.multilingual import _t
from ..operators.operator_material import set_op_mat_disp_strength
# TODO(SOFT-2421): Deactivated as it seems to have unwanted side effects.
# from .dlg_assets import _draw_button_quick_preview
from .utils_dlg import (
    check_convention,
    get_model_op_details,
    safe_size_apply)
from .. import reporting


def show_quick_menu(
        cTB, asset_data: AssetData, hide_detail_view: bool = False) -> None:
    """Generates the quick options menu next to an asset in the UI grid."""

    asset_type_data = asset_data.get_type_data()
    asset_name = asset_data.asset_name
    asset_id = asset_data.asset_id
    asset_type = asset_data.asset_type
    credits = 0 if asset_data.credits is None else asset_data.credits
    is_free = credits == 0

    # Configuration
    if asset_data.is_purchased:
        # If downloading and already purchased.
        title = _t("Choose Texture Size")
    else:
        title = asset_name

    in_scene = False

    sizes = asset_type_data.get_size_list(local_only=False)
    downloaded = asset_type_data.get_size_list(
        local_only=True,
        addon_convention=cTB._asset_index.addon_convention,
        local_convention=asset_data.get_convention(local=True))

    key = get_search_key(
        tab=KEY_TAB_IMPORTED, search="", category_list=[CATEGORY_ALL])
    query_key = cTB._asset_index._query_key_to_tuple(
        key, chunk=-1, chunk_size=1000000)
    if query_key not in cTB._asset_index.cached_queries:
        # The request is probably still in flight.
        in_scene = False
    elif asset_id in cTB._asset_index.cached_queries[query_key]:
        in_scene = True

    prefer_blend = cTB.settings["download_prefer_blend"]
    link_blend = cTB.link_blend_session

    blend_exists = False
    fbx_exists = False
    if asset_type == AssetType.MODEL:
        blend_exists = asset_type_data.has_mesh(
            model_type=ModelType.BLEND,
            native_only=True,
            renderer=None)  # None is for legacy cycles models w/o engine name
        fbx_exists = asset_type_data.has_mesh(
            model_type=ModelType.FBX,
            native_only=False,
            renderer="")

    any_model = blend_exists or fbx_exists
    is_linked_blend_import = prefer_blend and link_blend and blend_exists

    def _imported_model_extras(
            context, layout: bpy.types.UILayout) -> None:
        area = cTB.settings["area"]
        if area != KEY_TAB_IMPORTED or asset_type != AssetType.MODEL:
            return

        op = layout.operator(
            "poliigon.poliigon_select",
            text=_t("Select"),
            icon="RESTRICT_SELECT_OFF",
        )
        op.mode = "model"
        op.data = asset_name
        op.tooltip = _t("{0}\n(Select all instances)").format(asset_name)

        layout.separator()

    @reporting.handle_draw()
    def draw(self, context):
        layout = self.layout

        _imported_model_extras(context, layout)

        # List the different resolution sizes to provide.
        if asset_data.is_purchased or is_free or cTB.is_unlimited_user():
            for size in sizes:
                if asset_type == AssetType.TEXTURE:
                    draw_material_sizes(context, size, layout)
                elif asset_type == AssetType.MODEL:
                    draw_model_sizes(context, size, layout)
                elif asset_type == AssetType.HDRI:
                    draw_hdri_sizes(context, size, layout)
                else:
                    label = _t("{0} not implemented yet").format(asset_type)
                    layout.label(text=label)
        # TODO(SOFT-2421): Deactivated as it seems to have unwanted side
        # effects.
        # else:
        #     _draw_button_quick_preview(
        #         cTB,
        #         layout_row=layout,
        #         asset_data=asset_data,
        #         is_selection=True,
        #         have_text_label=True
        #     )
        # If the else branch is activated, the following separator needs to be
        # outside if/else.
        layout.separator()

        op = layout.operator(
            "poliigon.open_preferences",
            text=_t("Open Import options in Preferences"),
            icon="PREFERENCES",
        )
        op.set_focus = "show_default_prefs"
        layout.separator()

        # Always show view online and high res previews.
        if not hide_detail_view:
            # The new detail viewer design is unstable on OSX and, once it
            # fails, keeps failing for the remainder of the session. For
            # consistency, we disable it outright for all OSX users until a
            # better solution is found.
            is_osx = bpy.app.build_platform.lower() == b"darwin"
            if bpy.app.version >= (4, 2) and not is_osx:
                op_name = "poliigon.detail_view_open"
                op_text = _t("View Asset Details")
            else:
                op_name = "poliigon.view_thumbnail"
                op_text = _t("View Large Preview")

            op = layout.operator(
                op_name,
                text=op_text,
                icon="OUTLINER_OB_IMAGE",
            )
            op.asset_id = asset_data.asset_id

        op = layout.operator(
            "poliigon.poliigon_link",
            text=_t("View online"),
            icon_value=cTB.ui_icons["ICON_poliigon"].icon_id,
        )
        op.mode = str(asset_id)
        op.tooltip = _t("View on Poliigon.com")

        # If already local, support opening the folder location.
        if not downloaded:
            return

        op = layout.operator(
            "poliigon.poliigon_folder",
            text=_t("Open folder location"),
            icon="FILE_FOLDER")
        op.asset_id = asset_id

        # ... and provide the option to sync with the asset browser.
        # TODO(Andreas): Asset Browser integration and AssetIndex
        # in_asset_browser = asset_data.get("in_asset_browser", False)
        in_asset_browser = asset_data.runtime.is_in_asset_browser()
        is_feature_avail = bpy.app.version >= (3, 0)
        missing_local_model = asset_type == AssetType.MODEL and not any_model
        if not is_feature_avail or missing_local_model:
            return

        client_starting = cTB.lock_client_start.locked()
        layout.separator()
        row = layout.row()
        op = row.operator(
            "poliigon.update_asset_browser",
            text=_t("Synchronize with Asset Browser"),
            icon="FILE_REFRESH")
        op.asset_id = asset_id
        row.enabled = not in_asset_browser and not client_starting

    def draw_material_sizes(
            context, size: str, layout: bpy.types.UILayout) -> None:
        """Draw the menu row for a material's single resolution size."""

        row = layout.row()
        imported = f"{asset_name}_{size}" in bpy.data.materials

        if asset_data.get_convention() >= 1:
            all_expected_maps_for_size = asset_type_data.all_expected_maps_local(
                cTB.user.map_preferences, size)
        else:
            all_expected_maps_for_size = size in downloaded

        if imported or all_expected_maps_for_size:
            # Action: Load and apply it
            if imported:
                label = _t("{0} (apply material)").format(size)
                tip = _t("Apply {0} Material\n{1}").format(size, asset_name)
            elif context.selected_objects:
                label = _t("{0} (import + apply)").format(size)
                tip = _t("Apply {0} Material\n{1}").format(size, asset_name)
            else:
                label = _t("{0} (import)").format(size)
                tip = _t("Import {0} Material\n{1}").format(size, asset_name)

            # If nothing is selected and this size is already imported,
            # then there's nothing to do.
            if imported and not context.selected_objects:
                row.enabled = False

            op = row.operator(
                "poliigon.poliigon_material",
                text=label,
                icon="TRACKING_REFINE_BACKWARDS")
            # Order is relevant here. vType needs to be set before vSize!
            op.asset_id = asset_id
            safe_size_apply(cTB, op, size, asset_name)
            op.mapping = "UV"
            op.scale = 1.0
            op.use_16bit = cTB.settings["use_16"]
            op.reuse_material = True
            op.tooltip = tip
            set_op_mat_disp_strength(op, asset_name, op.mode_disp)
        else:
            # Action: Download
            # (for free assets this is purchase + implicit auto-download)
            if check_convention(asset_data):
                label = _t("{0} (download)").format(size)
            else:
                label = _t("{0} (Update needed)").format(size)
                row.enabled = False
            op = row.operator(
                "poliigon.poliigon_download",
                text=label,
                icon="IMPORT")
            op.asset_id = asset_id
            safe_size_apply(cTB, op, size, asset_name)
            if is_free and not asset_data.is_purchased:
                op.mode = "purchase"
            else:
                op.mode = "download"
            op.tooltip = _t("Download {0} Material\n{1}").format(
                size, asset_name)

    def draw_model_sizes(
            context, size: str, layout: bpy.types.UILayout) -> None:
        """Draw the menu row for a model's single resolution size."""
        row = layout.row()

        if size in downloaded and any_model:
            # Action: Load and apply it
            lod, label, tip = get_model_op_details(
                cTB, asset_data, size)
            if is_linked_blend_import:
                label += _t(" (disable link .blend to import size)")

            op = row.operator(
                "poliigon.poliigon_model",
                text=label,
                icon="TRACKING_REFINE_BACKWARDS")
            op.asset_id = asset_id
            safe_size_apply(cTB, op, size, asset_name)
            op.tooltip = tip
            op.lod = lod if len(lod) > 0 else "NONE"
            row.enabled = not is_linked_blend_import
        else:
            # Action: Download
            if check_convention(asset_data):
                label = _t("{0} (download)").format(size)
            else:
                label = _t("{0} (Update needed)").format(size)
                row.enabled = False
            op = row.operator(
                "poliigon.poliigon_download",
                text=label,
                icon="IMPORT")
            op.asset_id = asset_id
            safe_size_apply(cTB, op, size, asset_name)
            if is_free and not asset_data.is_purchased:
                op.mode = "purchase"
            else:
                op.mode = "download"
            op.tooltip = _t("Download {0} textures\n{1}").format(
                size, asset_name)

    def draw_hdri_sizes(
            context, size: str, layout: bpy.types.UILayout) -> None:
        """Draw the menu row for an HDRI's single resolution size."""
        row = layout.row()

        size_light = ""
        if in_scene:
            image_name_light = asset_name + "_Light"
            if image_name_light in bpy.data.images.keys():
                path_light = bpy.data.images[image_name_light].filepath
                filename = os.path.basename(path_light)
                match_object = re.search(r"_(\d+K)[_\.]", filename)
                size_light = match_object.group(1) if match_object else cTB.settings["hdri"]

        if size in downloaded:
            # Action: Load and apply it
            if size == size_light:
                label = _t("{0} (apply HDRI)").format(size)
                tip = _t("Apply {0} HDRI\n{1}").format(size, asset_name)
            else:
                label = _t("{0} (import HDRI)").format(size)
                tip = _t("Import {0} HDRI\n{1}").format(size, asset_name)

            op = row.operator(
                "poliigon.poliigon_hdri",
                text=label,
                icon="TRACKING_REFINE_BACKWARDS")
            op.asset_id = asset_id
            safe_size_apply(cTB, op, size, asset_name)
            if cTB.settings["hdri_use_jpg_bg"]:
                size_bg = cTB.settings["hdrib"]
                size_bg = asset_type_data.get_size(
                    size_bg,
                    local_only=True,
                    addon_convention=cTB.addon_convention,
                    local_convention=asset_data.get_convention(local=True))
                op.size_bg = f"{size_bg}_JPG"
            else:
                op.size_bg = f"{size}_EXR"
            op.tooltip = tip

        else:
            # Action: Download
            if check_convention(asset_data):
                label = _t("{0} (download)").format(size)
            else:
                label = _t("{0} (Update needed)").format(size)
                row.enabled = False
            op = row.operator(
                "poliigon.poliigon_download",
                text=label,
                icon="IMPORT")
            op.asset_id = asset_id
            safe_size_apply(cTB, op, size, asset_name)
            if is_free and not asset_data.is_purchased:
                op.mode = "purchase"
            else:
                op.mode = "download"
            op.tooltip = _t("Download {0}\n{1}").format(size, asset_name)

    # Generate the popup menu.
    bpy.context.window_manager.popup_menu(draw, title=title, icon="QUESTION")
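The purchase-versus-download branch repeated in all three size-row helpers above reduces to one rule. The sketch below is a hypothetical refactoring for illustration (`choose_download_mode` is not a name from the addon):

```python
def choose_download_mode(is_free: bool, is_purchased: bool) -> str:
    # Free assets that were never purchased go through "purchase", which
    # implicitly auto-downloads; everything else is a plain "download".
    if is_free and not is_purchased:
        return "purchase"
    return "download"
```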
@@ -0,0 +1,220 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

from typing import Optional, Tuple

import bpy

from ..modules.poliigon_core.assets import (
    AssetData,
    ModelType)
from ..modules.poliigon_core.multilingual import _t

from ..constants import SUPPORTED_CONVENTION
from ..utils import construct_model_name


def check_convention(asset_data: AssetData, local: bool = False) -> bool:
    asset_convention = asset_data.get_convention(local=local)

    if asset_convention is None:
        return False
    elif asset_convention > SUPPORTED_CONVENTION:
        return False
    return True


def get_model_op_details(
        cTB, asset_data: AssetData, size: str) -> Tuple[str, str, str]:
    """Get details to use in the UI for a given model and size."""

    asset_type_data = asset_data.get_type_data()
    asset_name = asset_data.asset_name

    default_lod = cTB.settings["lod"]
    downloaded = asset_type_data.get_size_list(
        local_only=True,
        addon_convention=cTB._asset_index.addon_convention,
        local_convention=asset_data.get_convention(local=True))

    lod = asset_type_data.get_lod(default_lod)
    if lod is None:
        lod = "NONE"

    if not asset_type_data.has_mesh(ModelType.FBX):
        lod = "NONE"

    coll_name = construct_model_name(asset_name, size, lod)

    coll = bpy.data.collections.get(coll_name)
    in_scene = coll is not None

    label = ""
    tip = ""
    if size in downloaded:
        if in_scene:
            if lod:
                label = _t("{0} {1} (import again)").format(size, lod)
                tip = _t("Import {0} {1} again\n{2}").format(
                    size, lod, asset_name)
            else:
                label = _t("{0} (import again)").format(size)
                tip = _t("Import {0} again\n{1}").format(size, asset_name)
        else:
            if lod:
                label = _t("{0} {1} (import)").format(size, lod)
                tip = _t("Import {0} {1}\n{2}").format(
                    size, lod, asset_name)
            else:
                label = _t("{0} (import)").format(size)
                tip = _t("Import {0}\n{1}").format(size, asset_name)

    return lod, label, tip


def safe_size_apply(cTB,
                    op_ref: bpy.types.OperatorProperties,
                    size_value: str,
                    asset_name: str) -> None:
    """Applies a size value to an operator draw with a safe fallback.

    If we try to apply a size which is not recognized as local, it will fail
    and disrupt further drawing. This function mitigates this problem.
    """
    try:
        op_ref.size = size_value
    except TypeError as e:
        # Since this is a UI draw issue, there will be multiple of these
        # reports, but we have user-level debouncing for a max number
        # per message type.
        msg = f"Failed to assign {size_value} size for {asset_name}: {e}"
        cTB.logger_ui.error(msg)
        # TODO(SOFT-1303): Include in refactor to asset index, disabled
        # overreporting for now.
        # reporting.capture_message("failed_size_op_set", msg, "error")


def check_dpi(cTB, force: bool = True) -> None:
    """Checks the DPI of the screen to adjust the scale accordingly.

    Used to ensure previews remain square and to avoid text truncation.
    """

    if not force and cTB.ui_scale_checked:
        return

    prefs = bpy.context.preferences
    cTB.settings["win_scale"] = prefs.system.ui_scale
    cTB.ui_scale_checked = True


def get_ui_scale(cTB) -> float:
    """Utility for fetching the UI scale, used in draw code."""

    check_dpi(cTB)
    return cTB.settings["win_scale"]


def _get_line_width(cTB, line: str) -> float:
    """Returns the approximate pixel width of a string."""

    width_line = 15
    for _char in line:
        if _char in "ABCDEFGHKLMNOPQRSTUVWXYZmw":
            width_line += 9
        elif _char in "abcdeghknopqrstuvxyz0123456789":
            width_line += 6
        elif _char in "IJfijl .":
            width_line += 3

    width_line *= get_ui_scale(cTB)
    return width_line


def wrapped_label(cTB,
                  width: int,
                  text: str,
                  container: bpy.types.UILayout,
                  icon: Optional[str] = None,
                  add_padding: bool = False,
                  add_padding_top: bool = False,
                  add_padding_bottom: bool = False,
                  ) -> None:
    """Text wrap a label based on the indicated width."""

    cTB.logger_ui.debug(f"wrapped_label width={width}, text={text}, "
                        f"icon={icon}, add_padding={add_padding}")

    list_words = [_word.replace("!@#", " ") for _word in text.split(" ")]

    row = container.row()
    parent = row.column(align=True)
    parent.scale_y = 0.8  # To make vertical height more natural for text.

    if add_padding or add_padding_top:
        parent.label(text="")

    if icon is not None:
        width -= 25 * get_ui_scale(cTB)

    line = ""
    first = True

    for _word in list_words:
        width_line = _get_line_width(cTB, line + _word + " ")
        if width_line > width:
            if first:
                if icon is None:
                    parent.label(text=line)
                else:
                    parent.label(text=line, icon=icon)
                first = False
            else:
                if icon is None:
                    parent.label(text=line)
                else:
                    parent.label(text=line, icon="BLANK1")

            line = _word + " "
        else:
            line += _word + " "

    if line != "":
        if icon is None:
            parent.label(text=line)
        else:
            if first:
                parent.label(text=line, icon=icon)
            else:
                parent.label(text=line, icon="BLANK1")

    if add_padding or add_padding_bottom:
        parent.label(text="")


def separator_p4b(
        container: bpy.types.UILayout, *, line: bool = False) -> None:
    if line and bpy.app.version >= (4, 2):
        container.separator(type='LINE')
    else:
        container.separator()
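Outside of Blender, the greedy wrapping performed by `_get_line_width` and `wrapped_label` above can be sketched without `bpy` by replacing the label calls with list appends. This is an illustrative rewrite (`line_width` and `wrap_text` are hypothetical names; the UI scale factor is assumed to be 1.0):

```python
def line_width(line: str) -> int:
    # Approximate pixel width per character class, as in _get_line_width
    # (UI scale assumed 1.0 for this sketch).
    width = 15
    for char in line:
        if char in "ABCDEFGHKLMNOPQRSTUVWXYZmw":
            width += 9
        elif char in "abcdeghknopqrstuvxyz0123456789":
            width += 6
        elif char in "IJfijl .":
            width += 3
    return width


def wrap_text(text: str, max_width: int) -> list:
    # Greedy wrap: flush the current line once appending the next word
    # would exceed max_width, as wrapped_label does with UI labels.
    lines = []
    line = ""
    for word in text.split(" "):
        if line_width(line + word + " ") > max_width:
            lines.append(line)
            line = word + " "
        else:
            line += word + " "
    if line:
        lines.append(line)
    return lines
```

For example, `wrap_text("aa bb cc", 40)` yields three lines, since a single two-letter word plus trailing space measures 30 units but two of them together measure 45, exceeding the 40-unit budget.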
Binary file not shown.
File diff suppressed because it is too large
@@ -0,0 +1,200 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

import bpy


# {<node.bl_static_type>: {<socket name used>: <socket name version>}}
SOCKET_NAMES_2_8 = {
    "ADD_SHADER": {
        "A": 0,  # Code needs allow_index to be True!
        "B": 1,  # Code needs allow_index to be True!
    },
    "BSDF_PRINCIPLED": {
        "Emission Color": "Emission",
        "Transmission Weight": "Transmission",
    },
    "MATH": {
        "A": 0,  # Code needs allow_index to be True!
        "B": 1,  # Code needs allow_index to be True!
    },
    "MIX": {
        "A": "Color1",
        "B": "Color2"
    },
    "MIX_RGB": {
        "Factor": "Fac",
        "A": "Color1",
        "B": "Color2",
        "Result": "Color"
    },
    "MIX_SHADER": {
        "A": 1,
        "B": 2,
    },
    "VECT_MATH": {
        "A": 0,  # Code needs allow_index to be True!
        "B": 1,  # Code needs allow_index to be True!
    },
}
SOCKET_NAMES_3_0 = {
    "ADD_SHADER": {
        "A": 0,  # Code needs allow_index to be True!
        "B": 1,  # Code needs allow_index to be True!
    },
    "BSDF_PRINCIPLED": {
        "Emission Color": "Emission",
        "Transmission Weight": "Transmission",
    },
    "MATH": {
        "A": 0,  # Code needs allow_index to be True!
        "B": 1,  # Code needs allow_index to be True!
    },
    "MIX": {
        "A": "Color1",
        "B": "Color2"
    },
    "MIX_RGB": {
        "Factor": "Fac",
        "A": "Color1",
        "B": "Color2",
        "Result": "Color"
    },
    "MIX_SHADER": {
        "A": 1,  # Code needs allow_index to be True!
        "B": 2,  # Code needs allow_index to be True!
    },
    "VECT_MATH": {
        "A": 0,  # Code needs allow_index to be True!
        "B": 1,  # Code needs allow_index to be True!
    },
}
SOCKET_NAMES_3_4 = {
    "ADD_SHADER": {
        "A": 0,  # Code needs allow_index to be True!
        "B": 1,  # Code needs allow_index to be True!
    },
    "BSDF_PRINCIPLED": {
        "Emission Color": "Emission",
        "Transmission Weight": "Transmission",
    },
    "MATH": {
        "A": 0,  # Code needs allow_index to be True!
        "B": 1,  # Code needs allow_index to be True!
    },
    "MIX": {
        "A": 6,  # Code needs allow_index to be True!
        "B": 7,  # Code needs allow_index to be True!
        "Result": 2  # Code needs allow_index to be True!
    },
    "MIX_RGB": {
        "Factor": "Fac",
        "A": "Color1",
        "B": "Color2",
        "Result": "Color"
    },
    "MIX_SHADER": {
        "A": 1,  # Code needs allow_index to be True!
        "B": 2,  # Code needs allow_index to be True!
    },
    "VECT_MATH": {
        "A": 0,  # Code needs allow_index to be True!
        "B": 1,  # Code needs allow_index to be True!
    },
}

SOCKET_NAMES_4_0 = {
    "ADD_SHADER": {
        "A": 0,  # Code needs allow_index to be True!
        "B": 1,  # Code needs allow_index to be True!
    },
    "MATH": {
        "A": 0,  # Code needs allow_index to be True!
        "B": 1,  # Code needs allow_index to be True!
    },
    "MIX_RGB": {
        "Factor": "Fac",
        "A": "Color1",
        "B": "Color2",
        "Result": "Color"
    },
    "MIX_SHADER": {
        "A": 1,  # Code needs allow_index to be True!
        "B": 2,  # Code needs allow_index to be True!
    },
    "VECT_MATH": {
        "A": 0,  # Code needs allow_index to be True!
        "B": 1,  # Code needs allow_index to be True!
    },
}
SOCKET_NAMES_4_3 = {
    "ADD_SHADER": {
        "A": 0,  # Code needs allow_index to be True!
        "B": 1,  # Code needs allow_index to be True!
    },
    "MATH": {
        "A": 0,  # Code needs allow_index to be True!
        "B": 1,  # Code needs allow_index to be True!
    },
    "MIX_SHADER": {
        "A": 1,  # Code needs allow_index to be True!
        "B": 2,  # Code needs allow_index to be True!
    },
    "VECT_MATH": {
        "A": 0,  # Code needs allow_index to be True!
        "B": 1,  # Code needs allow_index to be True!
    },
}

SOCKET_NAMES = {
    (2, 8): SOCKET_NAMES_2_8,
    (3, 0): SOCKET_NAMES_3_0,
    (3, 4): SOCKET_NAMES_3_4,
    (4, 0): SOCKET_NAMES_4_0,
    (4, 3): SOCKET_NAMES_4_3,
    (999, 999): None  # upper version bound, used during iteration
}


def get_socket_name(node: bpy.types.Node, sock_name: str) -> str:
    """Returns a socket name for the running Blender version."""

    ver_blender = bpy.app.version

    list_versions = list(SOCKET_NAMES.keys())
    zip_versions = zip(list_versions[:-1], list_versions[1:])
    socket_names = None
    for _version_low, _version_high in zip_versions:
        if _version_low <= ver_blender < _version_high:
            socket_names = SOCKET_NAMES[_version_low]
            break

    if socket_names is None:
        print(f"No socket name table found for Blender version {ver_blender}")
        return sock_name

    if node.bl_static_type not in socket_names:
        # Not an error, just a version independent port name
        return sock_name

    socket_names_node = socket_names[node.bl_static_type]

    if sock_name not in socket_names_node:
        # Not an error, just a version independent port name
        return sock_name

    return socket_names_node[sock_name]
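
The version lookup in get_socket_name walks consecutive keys of SOCKET_NAMES as half-open [low, high) intervals, which is why the (999, 999) sentinel exists. The mechanism can be sketched independently of Blender; table contents and names below are illustrative, not the real socket data:

```python
# Standalone sketch of the half-open interval lookup used by
# get_socket_name(): consecutive version keys form [low, high) ranges,
# and Python's tuple comparison handles versions of differing length.
TABLES = {
    (2, 8): "table_2_8",
    (3, 0): "table_3_0",
    (4, 0): "table_4_0",
    (999, 999): None,  # upper bound sentinel, never selected itself
}


def find_table(version):
    versions = list(TABLES.keys())  # insertion order is preserved
    for low, high in zip(versions[:-1], versions[1:]):
        if low <= version < high:
            return TABLES[low]
    return None  # version outside all ranges
```

A version such as (3, 6, 2) falls in the [(3, 0), (4, 0)) range because tuple comparison stops at the first differing element.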
@@ -0,0 +1,586 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

import os
import re
from typing import List, Optional, Tuple, Union

import bpy

from .modules.poliigon_core.assets import (AssetData,
                                           AssetType,
                                           SIZES)
from .material_import_cycles_port_names import get_socket_name
from . import reporting
from .utils import compare_simple_property_group


ASSET_TYPE_TO_IMPORTED_TYPE = {
    AssetType.TEXTURE: "Textures",
    AssetType.MODEL: "Models",
    AssetType.HDRI: "HRDIs",
    AssetType.BRUSH: "Brushes",
    AssetType.ALL: "All Assets"
}


def find_identical_material(asset_data: AssetData,
                            size: str,
                            mapping: str,
                            scale: float,
                            displacement: float,
                            use_16bit: bool,
                            mode_disp: str
                            ) -> Optional[bpy.types.Material]:
    """Tries to find a parameter-wise identical material in the current
    scene.
    """

    asset_name = asset_data.asset_name
    asset_type = asset_data.asset_type
    is_backplate = asset_data.is_backplate()
    asset_type_imported = ASSET_TYPE_TO_IMPORTED_TYPE[asset_type]

    identical_mat = None
    for mat in bpy.data.materials:
        if not mat.poliigon_props.asset_name.startswith(asset_name):
            continue
        if mat.poliigon_props.asset_type != asset_type_imported:
            continue
        if mat.poliigon_props.size != size:
            continue
        if mat.poliigon_props.mapping != mapping:
            continue
        if mat.poliigon_props.scale != scale:
            continue
        if mat.poliigon_props.displacement != displacement:
            continue
        if mat.poliigon_props.use_16bit != use_16bit:
            continue
        if mat.poliigon_props.mode_disp != mode_disp:
            continue
        if mat.poliigon_props.is_backplate != is_backplate:
            continue
        if not compare_simple_property_group(
                bpy.context.window_manager.polligon_map_prefs,
                mat.poliigon_props.map_prefs):
            continue
        identical_mat = mat
        break
    return identical_mat


def get_all_nodes(node_tree: bpy.types.NodeTree) -> List[bpy.types.Node]:
    nodes = list(node_tree.nodes)
    for node in node_tree.nodes:
        if node.bl_idname != "ShaderNodeGroup":
            continue
        elif not node.node_tree:
            continue
        nodes.extend(get_all_nodes(node.node_tree))
    return nodes


def get_node_by_type(
        group: bpy.types.Node, bl_idname: str) -> bpy.types.Node:
    """Returns first node of given type (bl_idname) found in group."""

    node_found = None
    for _node in group.node_tree.nodes:
        if _node.bl_idname != bl_idname:
            continue
        node_found = _node
        break
    return node_found


def get_node_by_name(group: bpy.types.Node, name: str) -> bpy.types.Node:
    """Returns first node with given name found in group."""

    node_found = None
    for _node in group.node_tree.nodes:
        if _node.name != name:
            continue
        node_found = _node
        break
    return node_found


def get_all_node_trees(node_tree: bpy.types.NodeTree,
                       include_root: bool = True
                       ) -> List[bpy.types.NodeTree]:
    node_trees = [node_tree] if include_root else []
    for node in node_tree.nodes:
        if node.bl_idname != "ShaderNodeGroup":
            continue
        elif not node.node_tree:
            continue
        node_trees.extend(get_all_node_trees(node.node_tree))
    return node_trees


def mat_get_nodes(mat: bpy.types.Material,
                  node_idname: str = "ShaderNodeTexImage"
                  ) -> List[bpy.types.Node]:
    if mat is None:
        return []

    nodes = get_all_nodes(mat.node_tree)

    tex_nodes = [
        node for node in nodes
        if node.bl_idname == node_idname
    ]
    return tex_nodes


def regex_size_rename(name_old: str, size_new: str) -> str:
    """Returns a name with size_new, if a size is found in name_old."""

    # Match an underscore, then a number (also multiple digits),
    # immediately followed by K.
    # group(1) contains the size token we are interested in.
    # Capturing group example: "whatever_4K" => "4K"
    name_new = name_old
    match_object = re.search(r"_(\d+K)", name_old)
    if match_object is not None:
        size_old = match_object.group(1)
        name_new = name_old.replace(size_old, size_new)
    return name_new
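
The rename behavior above can be exercised outside Blender; a minimal sketch of the same pattern (the function name here is illustrative, not part of the add-on):

```python
import re


def size_rename(name_old, size_new):
    # Same pattern as regex_size_rename: underscore, digits, "K".
    match = re.search(r"_(\d+K)", name_old)
    if match is None:
        return name_old  # no size token found, name stays unchanged
    # Note: str.replace swaps every occurrence of the old size token,
    # mirroring the original function's behavior.
    return name_old.replace(match.group(1), size_new)
```

For example, `size_rename("Fabric_4K", "8K")` yields `"Fabric_8K"`, while a name without a size token passes through untouched.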


def rename_material_and_nodes(mat: bpy.types.Material,
                              size: str) -> None:
    # Rename material, first
    mat.name = regex_size_rename(mat.name, size)
    # Then rename all group nodes containing size in name
    nodes = get_all_nodes(mat.node_tree)
    for _node in nodes:
        if _node.bl_idname != "ShaderNodeGroup":
            continue
        _node.name = regex_size_rename(_node.name, size)
    # Finally rename all node trees containing size in name
    node_trees = get_all_node_trees(
        mat.node_tree, include_root=False)
    for _node_tree in node_trees:
        _node_tree.name = regex_size_rename(
            _node_tree.name, size)


def replace_tex_size(materials: List,
                     asset_files: List[str],
                     size: str,
                     link_blend: bool
                     ) -> None:
    """Changes the texture resolution of all materials in list."""

    if link_blend:
        return

    for mat in materials:
        tex_nodes = mat_get_nodes(
            mat, node_idname="ShaderNodeTexImage")
        replaced_tex = False
        for node in tex_nodes:
            if node is None or node.image is None:
                continue

            # Match in order an underscore, digit number (also multiple
            # digits), immediately followed by K,
            # followed by an underscore or a period.
            # group(1) contains the digit number we are interested in.
            # Capturing group examples: "_4K." or "_16K_METALLIC"
            path_tex = node.image.filepath
            match_object = re.search(r"_(\d+K)[_\.]", path_tex)
            dir_parent = os.path.basename(os.path.dirname(path_tex))
            if match_object is not None:
                imported_size = match_object.group(1)
            elif "HIRES" in node.image.filepath:
                imported_size = "HIRES"
            elif dir_parent in SIZES:
                imported_size = dir_parent
            else:
                # TODO(Andreas): Need logger, here
                print("Invalid filepath for parsing", node.image.filepath)
                continue
            if imported_size == size:
                continue

            directory, filename = os.path.split(node.image.filepath)
            filename_desired_size = filename.replace(imported_size, size)
            directory_desired_size = directory.replace(imported_size, size)
            path_desired_size = os.path.join(
                directory_desired_size, filename_desired_size)
            path_found = None
            for path_asset_file in asset_files:
                if path_asset_file == path_desired_size:
                    path_found = path_asset_file
                    break
            if path_found is not None:
                node.image.filepath = path_found
                node.image.name = os.path.basename(path_found)
                replaced_tex = True
        # Finally also change the material name to the new size
        if replaced_tex:
            rename_material_and_nodes(mat, size)


def print_node_inputs_outputs(node: bpy.types.Node) -> None:
    """Prints input and output ports of a node with their names and data
    type.
    """

    print(f"Node: {node.name}")
    print("Inputs:")
    for idx, _in in enumerate(node.inputs):
        print("  ", idx, _in.name, _in.type)
    print("Outputs:")
    for idx, _out in enumerate(node.outputs):
        print("  ", idx, _out.name, _out.type)


def print_node_socket(
        node: bpy.types.Node,
        sock: bpy.types.NodeSocket,
        addressed: Union[str, int]
) -> None:
    """Prints some information about a node's socket."""

    print("NODE: ", node.name, node.type, node.bl_static_type)
    if sock.is_output:
        print("  OUT addressed: ", addressed)
        idx = list(node.outputs).index(sock)
    else:
        print("  IN addressed: ", addressed)
        idx = list(node.inputs).index(sock)
    print("  idx:       ", idx)
    print("  bl_idname: ", sock.bl_idname)
    # print("  bl_label:  ", sock.bl_label)
    print("  identifier:", sock.identifier)
    print("  label:     ", sock.label)
    print("  name:      ", sock.name)


def load_poliigon_node_group(node_type: str) -> bpy.types.Node:
    """Loads the needed node group from template, if not already local."""

    if node_type in bpy.data.node_groups.keys():
        return bpy.data.node_groups[node_type]

    dir_script = os.path.join(os.path.dirname(__file__), "files")
    path_template = os.path.join(dir_script,
                                 "poliigon_material_template.blend")

    if not os.path.exists(path_template):
        msg = f"Material template file missing!\n{path_template}"
        reporting.capture_message(
            "add_converter_node_no_template", msg, "error")
        return None

    nodes_before = list(bpy.data.node_groups)

    with bpy.data.libraries.load(path_template, link=False) as (from_file,
                                                                into):
        into.node_groups = [
            node_group for node_group in from_file.node_groups
            if node_group in [node_type]
        ]

    nodes_after = list(bpy.data.node_groups)
    # Safely get the newly imported datablock, without referencing by name.
    nodes_imported = list(set(nodes_after) - set(nodes_before))
    if len(nodes_imported) == 0:
        raise RuntimeError("No new node groups imported")
    elif len(nodes_imported) > 1:
        # Not supposed to occur
        # TODO(Andreas): Need logger, here
        print("Warning, more than one??")
    node_mosaic = nodes_imported[0]  # but just return first if more than one
    node_mosaic.name = node_type  # pass in UI friendly name
    return node_mosaic
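
The before/after set difference used above to find the newly imported datablock without relying on its name is a generic pattern; a minimal standalone sketch (names here are illustrative):

```python
def new_items(before, after):
    # Order-independent detection of items added between two snapshots,
    # mirroring load_poliigon_node_group's set difference. Elements must
    # be hashable; duplicates collapse, which is fine for datablocks.
    return list(set(after) - set(before))
```

Snapshot the collection before the load, snapshot it again after, and the difference is exactly what the load created, regardless of any automatic renaming (e.g. a ".001" suffix).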


def filter_textures_by_workflow(textures: List[str],
                                size: str,
                                name_mat: str
                                ) -> Tuple[List[str], bool]:

    def parent_dir_name(path: str) -> str:
        return os.path.basename(os.path.dirname(path))

    def filename_no_ext(path: str) -> str:
        return os.path.splitext(os.path.basename(path))[0]

    textures_metallic = [
        tex
        for tex in textures
        if filename_no_ext(tex).endswith("METALNESS")
        or parent_dir_name(tex) == "METALNESS"
    ]
    textures_specular = [
        tex
        for tex in textures
        if filename_no_ext(tex).endswith("SPECULAR")
        or parent_dir_name(tex) == "SPECULAR"
    ]
    textures_dielectric = [
        tex
        for tex in textures
        if tex not in textures_metallic and tex not in textures_specular
    ]
    textures_overlay = [
        tex
        for tex in textures
        if "OVERLAY" in filename_no_ext(tex)
    ]

    has_col_or_alpha = False
    for tex in textures:
        filename = filename_no_ext(tex)
        has_col = "COL" in filename
        has_alpha = "ALPHA" in filename
        if has_col or has_alpha:
            has_col_or_alpha = True
            break

    only_overlay = False
    # TODO(Andreas): Dear reviewer, before refactoring, below if statement
    #                had this additional condition:
    #                and len(textures_overlay) <= len(textures)
    #                Seeing how textures_overlay is generated above,
    #                it is always true, isn't it?
    if not has_col_or_alpha and len(textures_overlay) > 0:
        # This is an overlay, not a full texture.
        only_overlay = True
        textures_workflow = textures
    elif len(textures_metallic) >= 4:
        textures_workflow = textures_metallic + textures_dielectric
    elif len(textures_specular) >= 4:
        textures_workflow = textures_specular + textures_dielectric
    elif len(textures_dielectric) >= 4:
        textures_workflow = textures_dielectric
    elif size == "PREVIEW":
        textures_workflow = textures
    elif has_col_or_alpha and len(textures_dielectric) > 0:
        # Likely decals or seafoam, which only have color information
        # but don't have OVERLAY as a map pass (only COL or ALPHAMASKED).
        textures_workflow = textures_dielectric
    elif has_col_or_alpha and len(textures_metallic) > 0:
        # Likely remastered asset with too few metalness textures
        textures_workflow = textures_metallic
    else:
        msg = (
            f"Wrong tex counts for {name_mat} to determine workflow - "
            f"metal:{len(textures_metallic)}, "
            f"specular:{len(textures_specular)}, "
            f"dielectric:{len(textures_dielectric)}"
        )
        reporting.capture_message(
            "build_mat_error_workflow", msg, "error")
        return None, only_overlay
    return textures_workflow, only_overlay
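
The per-texture bucketing that feeds the workflow decision above rests on filename-suffix and parent-directory checks; a minimal standalone sketch of that classification step (paths and the function name are illustrative):

```python
import os


def classify(path):
    # Suffix/parent-directory test used to bucket a texture into a
    # workflow, mirroring filter_textures_by_workflow's list builds.
    stem = os.path.splitext(os.path.basename(path))[0]
    parent = os.path.basename(os.path.dirname(path))
    if stem.endswith("METALNESS") or parent == "METALNESS":
        return "metallic"
    if stem.endswith("SPECULAR") or parent == "SPECULAR":
        return "specular"
    # Anything neither metallic nor specular is treated as dielectric.
    return "dielectric"
```

A file can qualify either through its name (e.g. `COL_4K_METALNESS.jpg`) or by living in a workflow-named folder, which matches how both conditions are OR-ed in the list comprehensions.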


def get_socket(
        *,
        node: bpy.types.Node,
        sock_name: str,
        sock_bl_idname_expected: str,
        is_output: bool = True
) -> Optional[bpy.types.NodeSocket]:
    """Returns a socket of a given node.

    Compared to standard access, this function enforces socket reference by
    name instead of index and additionally checks that the socket is of the
    expected type.
    """

    if type(sock_name) is not str:
        msg = ("get_socket: For increased cross version compatibility index "
               "port addressing is no longer allowed. Use bl_idname instead! "
               f"{node.name}/{node.bl_idname} Name: {sock_name}")
        print(msg)
        reporting.capture_message("import_node_socket", msg, "error")
        return None

    if is_output:
        socket_list = node.outputs
    else:
        socket_list = node.inputs

    sock_name = get_socket_name(node, sock_name)
    if sock_name not in socket_list:
        msg = ("get_socket: Socket Name not found "
               f"{node.name}/{node.bl_idname} Name: {sock_name}")
        print(msg)
        reporting.capture_message("import_node_socket", msg, "error")
        return None

    sock = socket_list[sock_name]
    if sock.bl_idname != sock_bl_idname_expected:
        msg = ("get_socket: Wrong port type "
               f"{node.name}/{node.bl_idname}/{sock_name}: "
               f"{sock.bl_idname} != {sock_bl_idname_expected}")
        print(msg)
        reporting.capture_message("import_node_socket", msg, "error")
        return None

    return sock


def create_link(
        *,
        node_tree,
        node_out: bpy.types.Node,
        sock_out_name: str,
        sock_out_bl_idname_expected: str,
        node_in: bpy.types.Node,
        sock_in_name: str,
        sock_in_bl_idname_expected: str,
        allow_index: bool = False
) -> None:
    """Creates a link between an output and an input socket.

    Compared to plain link creation, this function uses our port name table
    to find the correct name for the running Blender version and enforces
    socket reference by name instead of index. It additionally checks that
    the sockets are of the expected types, for some increased safety against
    node changes.

    Optionally, index reference can still be allowed; this is needed for
    nodes like MIX or MATH (see port name tables SOCKET_NAMES), where named
    reference is not possible, since all input sockets have identical names.
    """

    if not allow_index and type(sock_out_name) is not str:
        msg = ("create_link: For increased cross version compatibility index "
               "port addressing is no longer allowed. Use bl_idname instead! "
               f"{node_out.name}/{node_out.bl_idname}: Name: {sock_out_name}")
        print(msg)
        reporting.capture_message("import_node_link", msg, "error")
        return
    if not allow_index and type(sock_in_name) is not str:
        msg = ("create_link: For increased cross version compatibility index "
               "port addressing is no longer allowed. Use bl_idname instead! "
               f"{node_in.name}/{node_in.bl_idname}: Name: {sock_in_name}")
        print(msg)
        reporting.capture_message("import_node_link", msg, "error")
        return

    sock_out_name = get_socket_name(node_out, sock_out_name)
    sock_in_name = get_socket_name(node_in, sock_in_name)

    if type(sock_out_name) is str and sock_out_name not in node_out.outputs:
        msg = ("create_link_nodes: Output Name not found "
               f"{node_out.name}/{node_out.bl_idname}: "
               f"Name: {sock_out_name}\n"
               f"    Available: {node_out.outputs.keys()}")
        print(msg)
        reporting.capture_message("import_node_link", msg, "error")
        return
    if type(sock_in_name) is str and sock_in_name not in node_in.inputs:
        msg = ("create_link_nodes: Input Name not found "
               f"{node_in.name}/{node_in.bl_idname}: Name: {sock_in_name}\n"
               f"    Available: {node_in.inputs.keys()}")
        print(msg)
        reporting.capture_message("import_node_link", msg, "error")
        return

    sock_out = node_out.outputs[sock_out_name]
    if sock_out.bl_idname != sock_out_bl_idname_expected:
        msg = ("create_link_nodes: Wrong output port type "
               f"{node_out.name}/{node_out.bl_idname}/{sock_out_name}: "
               f"{sock_out.bl_idname} != {sock_out_bl_idname_expected}\n"
               f"    Available ports: {node_out.outputs.keys()}")
        print(msg)
        reporting.capture_message("import_node_link", msg, "error")
        return
    sock_in = node_in.inputs[sock_in_name]
    if sock_in.bl_idname != sock_in_bl_idname_expected:
        msg = ("create_link_nodes: Wrong input port type "
               f"{node_in.name}/{node_in.bl_idname}/{sock_in_name}: "
               f"{sock_in.bl_idname} != {sock_in_bl_idname_expected}\n"
               f"    Available ports: {node_in.inputs.keys()}")
        print(msg)
        reporting.capture_message("import_node_link", msg, "error")
        return

    node_tree.links.new(sock_out, sock_in)


def create_link_sock_out(
        *,
        node_tree,
        sock_out: bpy.types.NodeSocket,
        node_in: bpy.types.Node,
        sock_in_name: str,
        sock_in_bl_idname_expected: str,
        allow_index: bool = False
) -> None:
    """Creates a link between an output and an input socket.

    Special case version of create_link() above, which allows passing in
    the output socket directly, while still keeping the advantages of
    create_link().
    """

    node_out = sock_out.node
    sock_out_name = sock_out.name
    sock_out_bl_idname_expected = sock_out.bl_idname

    create_link(
        node_tree=node_tree,
        node_in=node_in,
        sock_in_name=sock_in_name,
        sock_in_bl_idname_expected=sock_in_bl_idname_expected,
        node_out=node_out,
        sock_out_name=sock_out_name,
        sock_out_bl_idname_expected=sock_out_bl_idname_expected,
        allow_index=allow_index
    )


def set_value(
        *,
        node: bpy.types.Node,
        sock_name: str,
        sock_bl_idname_expected: str,
        value: any,
        allow_index: bool = False
) -> None:
    """Sets the value of a node's input socket.

    Compared to setting the value directly, this function uses our port name
    table to find the correct name for the running Blender version and
    enforces socket reference by name instead of index. It additionally
    checks that the socket is of the expected type, for some increased
    safety against node changes.

    Optionally, index reference can still be allowed; this is needed for
    nodes like MIX or MATH (see port name tables SOCKET_NAMES), where named
    reference is not possible, since all input sockets have identical names.
    """

    if not allow_index and type(sock_name) is not str:
        msg = ("set_value: For increased cross version compatibility index "
               "port addressing is no longer allowed. Use bl_idname instead! "
               f"{node.name}/{node.bl_idname}: Name: {sock_name}")
        print(msg)
        reporting.capture_message("import_node_value", msg, "error")
        return

    sock_name = get_socket_name(node, sock_name)
    if type(sock_name) is str and sock_name not in node.inputs:
        msg = ("set_value: Input Name not found "
               f"{node.name}/{node.bl_idname}: Name: {sock_name}")
        print(msg)
        reporting.capture_message("import_node_value", msg, "error")
        return

    sock = node.inputs[sock_name]
    if sock.bl_idname != sock_bl_idname_expected:
        msg = ("set_value: Wrong input port type "
               f"{node.name}/{node.bl_idname}/{sock_name}: "
               f"{sock.bl_idname} != {sock_bl_idname_expected}")
        print(msg)
        reporting.capture_message("import_node_value", msg, "error")
        return

    sock.default_value = value
@@ -0,0 +1,768 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

import mathutils
from typing import Optional

import bpy

from .material_import_utils import (
    load_poliigon_node_group,
    set_value)


def create_node(
        group: bpy.types.Node,
        bl_idname: str,
        parent: Optional[bpy.types.Node],
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        select: bool = False,
        hide: bool = False
) -> bpy.types.Node:
    """Creates an arbitrary node of type bl_idname."""

    node = group.node_tree.nodes.new(bl_idname)
    if name is not None:
        node.label = name
        node.name = name
    if parent is not None:
        node.parent = parent
    if location is not None:
        node.location = location
    if width is not None:
        node.width = width
    if height is not None:
        node.height = height
    node.select = select
    node.hide = hide
    return node


def create_node_socket(
        node_group: bpy.types.Node,
        *,
        socket_type: str = "NodeSocketVector",
        in_out: str = "INPUT",
        name: str = "Vector",
        description: str = ""
) -> None:
    """Creates a new input or output socket on a group node."""

    if bpy.app.version >= (4, 0):
        node_group.node_tree.interface.new_socket(
            name,
            description=description,
            in_out=in_out,
            socket_type=socket_type,
            parent=None
        )
    elif bpy.app.version >= (3, 4):
        if in_out == "INPUT":
            node_group.node_tree.inputs.new(socket_type, name)
        else:
            node_group.node_tree.outputs.new(socket_type, name)
    else:
        if in_out == "INPUT":
            node_group.inputs.new(socket_type, name)
        else:
            node_group.outputs.new(socket_type, name)
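
The branch ladder in create_node_socket dispatches on bpy.app.version via plain tuple comparison; the dispatch itself can be sketched without Blender (return strings are just labels for the three API paths):

```python
def pick_api(version):
    # Tuple comparison selects the socket-creation path per Blender
    # version, as in create_node_socket: newest check first.
    if version >= (4, 0):
        return "interface.new_socket"
    elif version >= (3, 4):
        return "node_tree.inputs.new"
    return "inputs.new"
```

Ordering the checks newest-first matters: a 4.x version also satisfies `>= (3, 4)`, so the more specific branch must come first.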
|
||||
|
||||
|
||||
def create_frame(
|
||||
group: bpy.types.Node,
|
||||
parent: Optional[bpy.types.Node],
|
||||
*,
|
||||
name: Optional[str] = None,
|
||||
location: Optional[mathutils.Vector] = None,
|
||||
width: Optional[float] = None,
|
||||
height: Optional[float] = None,
|
||||
hide: bool = False
|
||||
) -> bpy.types.Node:
|
||||
"""Creates an arbitrary node of type bl_idname."""
|
||||
|
||||
frame = create_node(
|
||||
group=group,
|
||||
bl_idname="NodeFrame",
|
||||
parent=parent,
|
||||
name=name,
|
||||
location=location,
|
||||
width=width,
|
||||
height=height,
|
||||
hide=hide
|
||||
)
|
||||
return frame
|
||||
|
||||
|
||||
def create_add_shader_node(
        group: bpy.types.Node,
        parent: Optional[bpy.types.Node] = None,
        *,
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        hide: bool = False
) -> bpy.types.Node:
    """Creates an 'Add Shader' node."""

    node_add_shader = create_node(
        group=group,
        bl_idname="ShaderNodeAddShader",
        parent=parent,
        name=name,
        location=location,
        width=width,
        height=height,
        hide=hide
    )
    return node_add_shader


def create_color_invert_node(
        group: bpy.types.Node,
        parent: Optional[bpy.types.Node] = None,
        *,
        factor: Optional[float] = 1.0,
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        hide: bool = False
) -> bpy.types.Node:
    """Creates a 'Color Invert' node."""

    node_invert_color = create_node(
        group=group,
        bl_idname="ShaderNodeInvert",
        parent=parent,
        name=name,
        location=location,
        width=width,
        height=height,
        hide=hide
    )
    if factor is not None:
        set_value(
            node=node_invert_color,
            sock_name="Fac",
            sock_bl_idname_expected="NodeSocketFloatFactor",
            value=factor)
    return node_invert_color


def create_combine_xyz_node(
        group: bpy.types.Node,
        parent: Optional[bpy.types.Node] = None,
        *,
        value_x: Optional[float] = 1.0,
        value_y: Optional[float] = 1.0,
        value_z: Optional[float] = 1.0,
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        hide: bool = False
) -> bpy.types.Node:
    """Creates a 'Combine XYZ' node."""

    node_combine_xyz = create_node(
        group=group,
        bl_idname="ShaderNodeCombineXYZ",
        parent=parent,
        name=name,
        location=location,
        width=width,
        height=height,
        hide=hide
    )
    if value_x is not None:
        set_value(
            node=node_combine_xyz,
            sock_name="X",
            sock_bl_idname_expected="NodeSocketFloat",
            value=value_x)
    if value_y is not None:
        set_value(
            node=node_combine_xyz,
            sock_name="Y",
            sock_bl_idname_expected="NodeSocketFloat",
            value=value_y)
    if value_z is not None:
        set_value(
            node=node_combine_xyz,
            sock_name="Z",
            sock_bl_idname_expected="NodeSocketFloat",
            value=value_z)
    return node_combine_xyz


def create_displacement_node(
        group: bpy.types.Node,
        parent: Optional[bpy.types.Node] = None,
        *,
        midlevel: Optional[float] = 0.0,
        scale: Optional[float] = 0.0,
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        hide: bool = False
) -> bpy.types.Node:
    """Creates a 'Displacement' node."""

    node_displacement = create_node(
        group=group,
        bl_idname="ShaderNodeDisplacement",
        parent=parent,
        name=name,
        location=location,
        width=width,
        height=height,
        hide=hide
    )
    if midlevel is not None:
        set_value(
            node=node_displacement,
            sock_name="Midlevel",
            sock_bl_idname_expected="NodeSocketFloat",
            value=midlevel)
    if scale is not None:
        set_value(
            node=node_displacement,
            sock_name="Scale",
            sock_bl_idname_expected="NodeSocketFloat",
            value=scale)
    return node_displacement


def create_fresnel_node(
        group: bpy.types.Node,
        parent: Optional[bpy.types.Node] = None,
        *,
        ior: Optional[float] = 1.150,
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        hide: bool = False
) -> bpy.types.Node:
    """Creates a 'Fresnel' node."""

    node_fresnel = create_node(
        group=group,
        bl_idname="ShaderNodeFresnel",
        parent=parent,
        name=name,
        location=location,
        width=width,
        height=height,
        hide=hide
    )
    if ior is not None:
        set_value(
            node=node_fresnel,
            sock_name="IOR",
            sock_bl_idname_expected="NodeSocketFloat",
            value=ior)
    return node_fresnel


def create_group_node(
        group: bpy.types.Node,
        parent: Optional[bpy.types.Node] = None,
        *,
        node_tree: Optional[bpy.types.NodeTree] = None,
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        hide: bool = False
) -> bpy.types.Node:
    """Creates a 'Group' node."""

    node_group = create_node(
        group=group,
        bl_idname="ShaderNodeGroup",
        parent=parent,
        name=name,
        location=location,
        width=width,
        height=height,
        hide=hide
    )
    if node_tree is not None:
        node_group.node_tree = node_tree
    else:
        node_tree = bpy.data.node_groups.new(name, "ShaderNodeTree")
        node_group.node_tree = node_tree

    node_inputs = node_group.node_tree.nodes.new("NodeGroupInput")
    node_inputs.select = False
    node_outputs = node_group.node_tree.nodes.new("NodeGroupOutput")
    node_outputs.select = False
    return node_group


def create_mapping_node(
        group: bpy.types.Node,
        parent: Optional[bpy.types.Node] = None,
        *,
        scale: float = 1.0,
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        hide: bool = False,
) -> bpy.types.Node:
    """Creates a 'Mapping' node."""

    node_mapping = create_node(
        group=group,
        bl_idname="ShaderNodeMapping",
        parent=parent,
        name=name,
        location=location,
        width=width,
        height=height,
        hide=hide
    )
    set_value(
        node=node_mapping,
        sock_name="Scale",
        sock_bl_idname_expected="NodeSocketVectorXYZ",
        value=[scale] * 3)
    return node_mapping


def create_math_node(
        group: bpy.types.Node,
        parent: Optional[bpy.types.Node] = None,
        *,
        operation: Optional[str] = "MULTIPLY",
        use_clamp: Optional[bool] = True,
        value1: Optional[float] = None,
        value2: Optional[float] = None,
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        hide: bool = False,
) -> bpy.types.Node:
    """Creates a 'Math' node."""

    node_math = create_node(
        group=group,
        bl_idname="ShaderNodeMath",
        parent=parent,
        name=name,
        location=location,
        width=width,
        height=height,
        hide=hide
    )
    if operation is not None:
        node_math.operation = operation
    if use_clamp is not None:
        node_math.use_clamp = use_clamp

    if value1 is not None:
        set_value(
            node=node_math,
            sock_name="A",
            sock_bl_idname_expected="NodeSocketFloat",
            value=value1,
            # Math node's input ports have identical names
            allow_index=True)
    if value2 is not None:
        set_value(
            node=node_math,
            sock_name="B",
            sock_bl_idname_expected="NodeSocketFloat",
            value=value2,
            # Math node's input ports have identical names
            allow_index=True)
    return node_math


def create_mix_node(
        group: bpy.types.Node,
        parent: Optional[bpy.types.Node] = None,
        *,
        data_type: Optional[str] = "RGBA",
        use_clamp: Optional[bool] = True,
        clamp_result: Optional[bool] = False,
        blend_type: Optional[str] = "MULTIPLY",
        blend_factor: Optional[float] = None,
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        hide: bool = False
) -> bpy.types.Node:
    """Creates a 'Mix' (or 'MixRGB') node."""

    if bpy.app.version >= (3, 4):
        bl_idname = "ShaderNodeMix"
    else:
        bl_idname = "ShaderNodeMixRGB"
    node_mix = create_node(
        group=group,
        bl_idname=bl_idname,
        parent=parent,
        name=name,
        location=location,
        width=width,
        height=height,
        hide=hide
    )
    if data_type is not None:
        if bpy.app.version >= (3, 4):
            node_mix.data_type = data_type
    if use_clamp is not None:
        if bpy.app.version >= (3, 4):
            node_mix.clamp_factor = use_clamp
        else:
            node_mix.use_clamp = use_clamp
    if clamp_result is not None:
        if bpy.app.version >= (3, 4):
            node_mix.clamp_result = clamp_result
    if blend_type is not None:
        node_mix.blend_type = blend_type
    if blend_factor is not None:
        set_value(
            node=node_mix,
            sock_name="Factor",
            sock_bl_idname_expected="NodeSocketFloatFactor",
            value=blend_factor)
    return node_mix
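Blender 3.4 replaced the MixRGB node with the generalized Mix node, renaming `use_clamp` to `clamp_factor` and adding a separate `clamp_result`. A minimal standalone sketch of that settings mapping, outside bpy (the function name is made up for illustration; the dict keys simply mirror the attribute names set above):

```python
# Illustrative sketch of the MixRGB -> Mix migration handled in
# create_mix_node: which node idname and clamp attribute apply per version.
def mix_node_settings(version, use_clamp):
    if version >= (3, 4):
        return {"bl_idname": "ShaderNodeMix", "clamp_factor": use_clamp}
    return {"bl_idname": "ShaderNodeMixRGB", "use_clamp": use_clamp}
```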


def create_mix_shader_node(
        group: bpy.types.Node,
        parent: Optional[bpy.types.Node] = None,
        *,
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        hide: bool = False
) -> bpy.types.Node:
    """Creates a 'Mix Shader' node."""

    node_mix_shader = create_node(
        group=group,
        bl_idname="ShaderNodeMixShader",
        parent=parent,
        name=name,
        location=location,
        width=width,
        height=height,
        hide=hide
    )
    return node_mix_shader


def create_mosaic_node(
        group: bpy.types.Node,
        parent: Optional[bpy.types.Node] = None,
        *,
        scale: float = 1.0,
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        hide: bool = False
) -> bpy.types.Node:
    """Creates a 'Poliigon Mosaic' node group."""

    node_group_mosaic = load_poliigon_node_group(
        "Mosaic_UV_Mapping")

    if name is None:
        name = node_group_mosaic.name

    node_mosaic = create_group_node(
        group=group,
        parent=parent,
        node_tree=node_group_mosaic,
        name=name,
        location=location,
        width=width,
        height=height,
        hide=hide
    )
    set_value(
        node=node_mosaic,
        sock_name="Scale",
        sock_bl_idname_expected="NodeSocketFloat",
        value=scale)
    return node_mosaic


def create_normal_node(
        group: bpy.types.Node,
        parent: Optional[bpy.types.Node] = None,
        *,
        space: Optional[str] = "TANGENT",
        strength: Optional[float] = None,
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        hide: bool = False
) -> bpy.types.Node:
    """Creates a 'Normal Map' node."""

    node_normal = create_node(
        group=group,
        bl_idname="ShaderNodeNormalMap",
        parent=parent,
        name=name,
        location=location,
        width=width,
        height=height,
        hide=hide
    )
    if space is not None:
        node_normal.space = space
    if strength is not None:
        set_value(
            node=node_normal,
            sock_name="Strength",
            sock_bl_idname_expected="NodeSocketFloat",
            value=strength)
    return node_normal


def create_texture_coordinate_node(
        group: bpy.types.Node,
        parent: Optional[bpy.types.Node] = None,
        *,
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        hide: bool = False
) -> bpy.types.Node:
    """Creates a 'Texture Coordinate' node."""

    node_tex_coord = create_node(
        group=group,
        bl_idname="ShaderNodeTexCoord",
        parent=parent,
        name=name,
        location=location,
        width=width,
        height=height,
        hide=hide
    )
    return node_tex_coord


def create_transparent_bsdf_node(
        group: bpy.types.Node,
        parent: Optional[bpy.types.Node] = None,
        *,
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        hide: bool = False
) -> bpy.types.Node:
    """Creates a 'Transparent BSDF' node."""

    node_transparent_bsdf = create_node(
        group=group,
        bl_idname="ShaderNodeBsdfTransparent",
        parent=parent,
        name=name,
        location=location,
        width=width,
        height=height,
        hide=hide
    )
    return node_transparent_bsdf


def create_translucent_bsdf_node(
        group: bpy.types.Node,
        parent: Optional[bpy.types.Node] = None,
        *,
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        hide: bool = False
) -> bpy.types.Node:
    """Creates a 'Translucent BSDF' node."""

    node_translucent_bsdf = create_node(
        group=group,
        bl_idname="ShaderNodeBsdfTranslucent",
        parent=parent,
        name=name,
        location=location,
        width=width,
        height=height,
        hide=hide
    )
    return node_translucent_bsdf


def create_value_node(
        group: bpy.types.Node,
        parent: Optional[bpy.types.Node] = None,
        *,
        value: Optional[float] = 0.0,
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        hide: bool = False
) -> bpy.types.Node:
    """Creates a 'Value' node."""

    node_value = create_node(
        group=group,
        bl_idname="ShaderNodeValue",
        parent=parent,
        name=name,
        location=location,
        width=width,
        height=height,
        hide=hide
    )
    if value is not None:
        # Note: This is an output being set.
        # set_value() is only for inputs!
        node_value.outputs[0].default_value = value
    return node_value


def create_vector_math_node(
        group: bpy.types.Node,
        parent: Optional[bpy.types.Node] = None,
        *,
        operation: str = "MULTIPLY",
        value1: Optional[mathutils.Vector] = None,
        value2: Optional[mathutils.Vector] = None,
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        hide: bool = False
) -> bpy.types.Node:
    """Creates a 'Vector Math' node."""

    node_vector_math = create_node(
        group=group,
        bl_idname="ShaderNodeVectorMath",
        parent=parent,
        name=name,
        location=location,
        width=width,
        height=height,
        hide=hide
    )
    if operation is not None:
        node_vector_math.operation = operation

    if value1 is not None:
        set_value(
            node=node_vector_math,
            sock_name="A",
            sock_bl_idname_expected="NodeSocketVector",
            value=value1,
            # Vector Math node's input ports have identical names
            allow_index=True)
    if value2 is not None:
        set_value(
            node=node_vector_math,
            sock_name="B",
            sock_bl_idname_expected="NodeSocketVector",
            value=value2,
            # Vector Math node's input ports have identical names
            allow_index=True)
    return node_vector_math


def create_vector_rotate_node(
        group: bpy.types.Node,
        parent: Optional[bpy.types.Node] = None,
        *,
        angle_rad: Optional[mathutils.Vector] = None,
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        hide: bool = False
) -> bpy.types.Node:
    """Creates a 'Vector Rotate' node."""

    node_vector_rotate = create_node(
        group=group,
        bl_idname="ShaderNodeVectorRotate",
        parent=parent,
        name=name,
        location=location,
        width=width,
        height=height,
        hide=hide
    )
    if angle_rad is not None:
        set_value(
            node=node_vector_rotate,
            sock_name="Angle",
            sock_bl_idname_expected="NodeSocketVector",
            value=angle_rad)
    return node_vector_rotate


def create_volume_absorption_node(
        group: bpy.types.Node,
        parent: Optional[bpy.types.Node] = None,
        *,
        density: Optional[float] = 100.0,
        name: Optional[str] = None,
        location: Optional[mathutils.Vector] = None,
        width: Optional[float] = None,
        height: Optional[float] = None,
        hide: bool = False
) -> bpy.types.Node:
    """Creates a 'Volume Absorption' node."""

    node_vol_abs = create_node(
        group=group,
        bl_idname="ShaderNodeVolumeAbsorption",
        parent=parent,
        name=name,
        location=location,
        width=width,
        height=height,
        hide=hide
    )
    if density is not None:
        set_value(
            node=node_vol_abs,
            sock_name="Density",
            sock_bl_idname_expected="NodeSocketFloat",
            value=density)
    return node_vol_abs
@@ -0,0 +1,279 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

import json
import os
from typing import Dict, List, Optional

import bpy

from .modules.poliigon_core.assets import AssetData
from .modules.poliigon_core.user import UserDownloadPreferences
from .material_import_cycles import CyclesMaterial, RENDERER_CYCLES
from .material_importer_params import MaterialImportParameters
from . import reporting


SUPPORTED_RENDERERS = [RENDERER_CYCLES]


class MaterialImporter():

    def __init__(self, cTB, renderer: str = RENDERER_CYCLES):
        self.cTB = cTB
        self.renderer = None
        self.importer = None
        self.params = None
        self.asset_data = None

        self.set_renderer(renderer)

    def set_renderer(self, renderer: str) -> None:
        """Sets the renderer to import materials for."""

        if renderer not in IMPORTERS:
            raise RuntimeError(
                f"Unsupported renderer: {renderer}\n"
                f"Supported: {list(IMPORTERS.keys())}")
        self.renderer = renderer
        self.importer = IMPORTERS[renderer]()

    def reset_asset(self) -> None:
        self.asset_data = None

    def convert_dict_to_asset_data(
            self, asset_dict: Dict) -> Optional[AssetData]:
        """Converts a P4B asset data dictionary into an addon-core AssetData
        instance.
        """

        asset_id = asset_dict.get("id", -1)
        if asset_id >= 0:
            # Backdoor import expects negative ID
            asset_id *= -1

        asset_name = asset_dict["name"]

        if len(asset_dict["files"]) == 0:
            raise RuntimeError("Material import for asset without any files")

        # Separate asset files per library directory
        # (if distributed across multiple)
        dirs_libraries = self.cTB.get_library_paths()
        files_per_dir = {}
        asset_files = asset_dict["files"]
        for _idx_dir, _dir in enumerate(dirs_libraries):
            _dir = os.path.normpath(_dir)
            files_per_dir[_idx_dir] = [
                _file
                for _file in asset_files
                if os.path.normpath(_file).startswith(_dir)
            ]
        # Get asset's base directory in each library directory
        dir_asset_per_lib = {}
        for _idx_dir, _files in files_per_dir.items():
            try:
                dir_asset_per_lib[_idx_dir] = os.path.commonpath(_files)
            except ValueError:
                pass  # deliberately ignored
        # Build backdoor file list (per library dir)
        file_list = []
        for _dir_asset in dir_asset_per_lib.values():
            file_list_from_dir = self.cTB._asset_index_mat.file_list_from_directory(
                asset_dir=_dir_asset, ignore_dirs=[])
            file_list.extend(file_list_from_dir)

        if len(file_list) == 0:
            # Backdoor imported asset outside of libraries?
            dir_asset = os.path.commonpath(asset_files)
            file_list_from_dir = self.cTB._asset_index_mat.file_list_from_directory(
                asset_dir=dir_asset, ignore_dirs=[])
            file_list.extend(file_list_from_dir)

        result = self.cTB._asset_index_mat.load_asset_from_list(
            asset_id=asset_id,
            asset_name=asset_name,
            asset_type=asset_dict["type"],
            size=asset_dict["sizes"][0],  # any size will do here
            lod="",  # not in use
            workflow_expected=asset_dict.get("workflows", ["METALNESS"])[0],
            file_list_json=json.dumps(file_list),
            query_string="p4b_mat_import/All Assets",
            convention=asset_dict.get("api_convention", 0)
        )
        if result is False:
            msg = f"Failed to convert to AssetData: {asset_id}"
            reporting.capture_message(
                "build_mat_error_create", msg, "error")
            self.cTB._asset_index_mat.flush(all_assets=True)
            return None

        asset_data = self.cTB._asset_index_mat.get_asset(asset_id)

        self.cTB._asset_index_mat.flush(all_assets=True)
        return asset_data
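The grouping step in the method above buckets asset files by library root and then reduces each bucket to its common base directory; `os.path.commonpath` raises `ValueError` on an empty sequence, which is why the bare `except` is needed for libraries that hold no files of this asset. The same logic can be sketched standalone, without bpy or the addon (the helper name is hypothetical):

```python
import os


# Hypothetical standalone version of the grouping above: bucket files by
# library root, then take each non-empty bucket's common base directory.
def asset_dirs_per_library(dirs_libraries, asset_files):
    dir_asset_per_lib = {}
    for idx_dir, lib_dir in enumerate(dirs_libraries):
        lib_dir = os.path.normpath(lib_dir)
        files = [_file for _file in asset_files
                 if os.path.normpath(_file).startswith(lib_dir)]
        try:
            dir_asset_per_lib[idx_dir] = os.path.commonpath(files)
        except ValueError:
            pass  # empty bucket: asset not in this library
    return dir_asset_per_lib
```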
    def set_parameters(self,
                       reuse_existing: bool,
                       do_apply: bool,
                       workflow: str,
                       lod: str,
                       size: str,
                       size_bg: Optional[str] = None,
                       variant: Optional[str] = None,
                       name_material: Optional[str] = None,
                       name_mesh: Optional[str] = None,
                       ref_objs: List[any] = [],
                       projection: str = "FLAT",
                       use_16bit: bool = True,
                       mode_disp: str = "NORMAL",
                       translate_x: float = 0.0,
                       translate_y: float = 0.0,
                       scale: float = 1.0,
                       global_rotation: float = 0.0,
                       aspect_ratio: float = 1.0,
                       displacement: float = 0.0,
                       keep_unused_tex_nodes: bool = False,
                       map_prefs: Optional[UserDownloadPreferences] = None
                       ) -> None:
        """Sets the parameters for a material import."""

        if self.asset_data is None:
            raise RuntimeError("No asset set!")

        self.params = MaterialImportParameters(
            asset_data=self.asset_data,
            reuse_existing=reuse_existing,
            do_apply=do_apply,
            workflow=workflow,
            lod=lod,
            size=size,
            size_bg=size_bg,
            variant=variant,
            name_material=name_material,
            name_mesh=name_mesh,
            ref_objs=ref_objs,
            projection=projection,
            use_16bit=use_16bit,
            mode_disp=mode_disp,
            translate_x=translate_x,
            translate_y=translate_y,
            scale=scale,
            global_rotation=global_rotation,
            aspect_ratio=aspect_ratio,
            displacement=displacement,
            keep_unused_tex_nodes=keep_unused_tex_nodes,
            addon_convention=self.cTB.addon_convention,
            map_prefs=map_prefs
        )

    def reset_parameters(self) -> None:
        """Resets all parameters."""

        self.params = None

    def get_existing_material(self) -> Optional[bpy.types.Material]:
        """Returns an already existing material of identical name.

        This is what legacy import in P4B did for Model assets.
        Texture assets were handled differently via
        find_identical_material().
        """

        if not self.params.reuse_existing:
            return None

        name_mat = self.params.name_material
        if name_mat in bpy.data.materials.keys():
            return bpy.data.materials[name_mat]

        return None

    def import_material(self,
                        *,
                        asset_data: AssetData,
                        do_apply: bool,
                        workflow: str,
                        size: str,
                        size_bg: Optional[str] = None,
                        lod: Optional[str] = None,
                        variant: Optional[str] = None,
                        name_material: Optional[str] = None,
                        name_mesh: Optional[str] = None,
                        ref_objs: Optional[List[any]] = None,
                        projection: str = "FLAT",
                        use_16bit: bool = True,
                        mode_disp: str = "NORMAL",
                        translate_x: float = 0.0,
                        translate_y: float = 0.0,
                        scale: float = 1.0,
                        global_rotation: float = 0.0,
                        aspect_ratio: float = 1.0,
                        displacement: float = 0.0,
                        keep_unused_tex_nodes: bool = False,
                        reuse_existing: bool = True,
                        map_prefs: Optional[UserDownloadPreferences] = None
                        ) -> Optional[bpy.types.Material]:
        """Imports a single material for an asset regardless of type."""

        if asset_data is None:
            return None
        self.asset_data = asset_data

        self.set_parameters(
            reuse_existing=reuse_existing,
            do_apply=do_apply,
            workflow=workflow,
            lod=lod,
            size=size,
            size_bg=size_bg,
            variant=variant,
            name_material=name_material,
            name_mesh=name_mesh,
            ref_objs=ref_objs,
            projection=projection,
            use_16bit=use_16bit,
            mode_disp=mode_disp,
            translate_x=translate_x,
            translate_y=translate_y,
            scale=scale,
            global_rotation=global_rotation,
            aspect_ratio=aspect_ratio,
            displacement=displacement,
            keep_unused_tex_nodes=keep_unused_tex_nodes,
            map_prefs=map_prefs
        )

        # Case for Model import, Texture assets handle material re-use still
        # in operator. TODO(Andreas)
        mat = self.get_existing_material()
        if mat is not None:
            self.reset_parameters()
            self.reset_asset()
            return mat

        mat = self.importer.import_material(self.asset_data, self.params)

        self.reset_parameters()
        self.reset_asset()
        return mat


IMPORTERS = {
    RENDERER_CYCLES: CyclesMaterial
}
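`IMPORTERS` maps a renderer constant to its importer class, and `set_renderer()` instantiates the class lazily through this registry, so unsupported renderers fail before anything is constructed. A self-contained sketch of the same lookup-then-instantiate pattern (all class and key names below are illustrative stand-ins, not addon code):

```python
# Minimal registry-dispatch sketch mirroring set_renderer()/IMPORTERS;
# every name here is an illustrative stand-in.
class _StubCyclesImporter:
    renderer = "Cycles"


_STUB_IMPORTERS = {"Cycles": _StubCyclesImporter}


def make_importer(renderer):
    if renderer not in _STUB_IMPORTERS:
        raise RuntimeError(f"Unsupported renderer: {renderer}\n"
                           f"Supported: {list(_STUB_IMPORTERS.keys())}")
    return _STUB_IMPORTERS[renderer]()
```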
@@ -0,0 +1,136 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

from dataclasses import dataclass
from typing import List, Optional

import bpy

from .modules.poliigon_core.assets import (AssetData,
                                           AssetType)
from .modules.poliigon_core.user import UserDownloadPreferences


@dataclass
class MaterialImportParameters():
    name_material: str
    reuse_existing: bool
    do_apply: bool
    workflow: str
    lod: str
    size: str
    size_bg: Optional[str]
    variant: Optional[str]
    is_preview: bool
    is_backplate: bool
    is_model_import: bool
    name_mesh: str
    ref_objs: List[any]
    projection: str
    use_16bit: bool
    mode_disp: str
    translate_x: float
    translate_y: float
    scale: float
    global_rotation: float
    aspect_ratio: float
    displacement: float
    keep_unused_tex_nodes: bool
    map_prefs: UserDownloadPreferences

    def __init__(self,
                 asset_data: AssetData,
                 reuse_existing: bool,
                 do_apply: bool,
                 workflow: str,
                 lod: str,
                 size: str,
                 size_bg: Optional[str] = None,
                 variant: Optional[str] = None,
                 name_material: Optional[str] = None,
                 name_mesh: Optional[str] = None,
                 ref_objs: List[any] = [],
                 projection: str = "FLAT",
                 use_16bit: bool = True,
                 mode_disp: str = "NORMAL",
                 translate_x: float = 0.0,
                 translate_y: float = 0.0,
                 scale: float = 1.0,
                 global_rotation: float = 0.0,
                 aspect_ratio: float = 1.0,
                 displacement: float = 0.0,
                 keep_unused_tex_nodes: bool = False,
                 addon_convention: int = 0,
                 map_prefs: Optional[UserDownloadPreferences] = None
                 ):
        asset_type_data = asset_data.get_type_data()
        local_convention = asset_data.get_convention(local=True)

        is_preview = size == "WM"
        is_backplate = asset_data.is_backplate()
        is_model_import = asset_data.asset_type == AssetType.MODEL
        if name_material is None:
            name_material = asset_data.get_material_name(size, variant)

        # Validate size and get closest locally available
        if size == "PREVIEW":
            size = "WM"

        size = asset_type_data.get_size(
            size=size,
            incl_watermarked=is_preview,
            local_only=True,
            addon_convention=addon_convention,
            local_convention=local_convention
        )
        # Validate workflow or get locally available
        workflow = asset_type_data.get_workflow(
            workflow=workflow,
            get_local=True
        )
        # Restrict displacement/normal mode based on render engine
        if bpy.context.scene.render.engine == "BLENDER_EEVEE":
            mode_disp = "NORMAL"

        if local_convention < 1:
            map_prefs = None

        self.name_material = name_material
        self.reuse_existing = reuse_existing
        self.do_apply = do_apply
        self.workflow = workflow
        self.lod = lod
        self.size = size
        self.size_bg = size_bg
        self.variant = variant
        self.is_preview = is_preview
        self.is_model_import = is_model_import
        self.is_backplate = is_backplate
        self.name_mesh = name_mesh
        self.ref_objs = ref_objs
        self.projection = projection
        self.use_16bit = use_16bit
        self.mode_disp = mode_disp
        self.translate_x = translate_x
        self.translate_y = translate_y
        self.scale = scale
        self.global_rotation = global_rotation
        self.aspect_ratio = aspect_ratio
        self.displacement = displacement
        self.keep_unused_tex_nodes = keep_unused_tex_nodes
        self.map_prefs = map_prefs
|
||||
@@ -0,0 +1,4 @@
[flake8]
extend-ignore = E501
exclude =
    __pycache__
@@ -0,0 +1,815 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

from concurrent.futures import Future
from datetime import datetime
from functools import lru_cache
from typing import Any, Callable, Dict, List, Optional, Tuple
from errno import EACCES, ENOSPC
import functools
import os
import json
import webbrowser

from .assets import (AssetType,
                     AssetData,
                     ModelType,
                     SIZES)
from .user import (PoliigonUser,
                   PoliigonSubscription)

from .plan_manager import SubscriptionState, PoliigonPlanUpgradeManager

from . import api
from . import asset_index
from . import env
from .logger import (DEBUG,  # noqa: F401, allowing downstream const usage
                     ERROR,
                     INFO,
                     get_addon_logger,
                     NOT_SET,
                     WARNING)
from .notifications import NotificationSystem
from . import settings
from . import updater
from .multilingual import Multilingual
from . import thread_manager as tm


DIR_PATH = os.path.dirname(os.path.abspath(__file__))
RESOURCES_PATH = os.path.join(DIR_PATH, "resources")


class PoliigonAddon():
    """Poliigon addon used for creating the base singleton in DCC applications."""

    addon_name: str  # e.g. poliigon-addon-blender
    addon_version: tuple  # Current addon version
    software_source: str  # e.g. blender
    software_version: tuple  # DCC software version, e.g. (3, 0)
    addon_convention: int  # Maximum convention supported by DCC implementation

    library_paths: List = []

    def __init__(self,
                 addon_name: str,
                 addon_version: tuple,
                 software_source: str,
                 software_version: tuple,
                 addon_env: env.PoliigonEnvironment,
                 addon_settings: settings.PoliigonSettings,
                 addon_convention: int,
                 addon_supported_model: List[ModelType] = [ModelType.FBX],
                 language: str = "en-US",
                 # See ThreadManager.__init__ for the signature below,
                 # e.g. print_exc(fut: Future, key_pool: PoolKeys)
                 callback_print_exc: Optional[Callable] = None):
        self.log_manager = get_addon_logger(env=addon_env)

        if addon_env.env_name == "prod":
            have_filehandler = False
        else:
            have_filehandler = True
        self.logger = self.log_manager.initialize_logger(
            have_filehandler=have_filehandler)
        self.logger_api = self.log_manager.initialize_logger(
            "API", have_filehandler=have_filehandler)
        self.logger_dl = self.log_manager.initialize_logger(
            "DL", have_filehandler=have_filehandler)

        self.language = language

        self.multilingual = Multilingual()
        self.multilingual.install_domain(
            language=self.language,
            dir_lang=os.path.join(RESOURCES_PATH, "lang"),
            domain="addon-core")

        self.addon_name = addon_name
        self.addon_version = addon_version
        self.software_source = software_source
        self.software_version = software_version
        self.addon_convention = addon_convention

        self.user = None
        self.login_error = None
        self.api_rc = None  # To be set on the DCC side

        self.upgrade_manager = PoliigonPlanUpgradeManager(self)

        self._env = addon_env

        self.set_logger_verbose(verbose=False)

        self._settings = addon_settings
        self._api = api.PoliigonConnector(
            env=self._env,
            software=software_source,
            logger=self.logger_api
        )
        self.logger.debug(f"API URL V1: {self._api.api_url}")
        self.logger.debug(f"API URL V2: {self._api.api_url_v2}")
        if "v1" in self._api.api_url and "apiv1" not in self._api.api_url:
            self.logger.warning(
                "Likely you are running with an outdated API V1 URL")
        self._api.register_update(
            ".".join([str(x) for x in addon_version]),
            ".".join([str(x) for x in software_version])
        )
        self._tm = tm.ThreadManager(callback_print_exc=callback_print_exc)
        self.notify = NotificationSystem(self)
        self._api.notification_system = self.notify
        self._updater = updater.SoftwareUpdater(
            addon_name=addon_name,
            addon_version=addon_version,
            software_version=software_version,
            notification_system=self.notify,
            local_json=self._env.local_updater_json
        )

        self.settings_config = self._settings.config

        self.user_addon_dir = os.path.join(
            os.path.expanduser("~"),
            "Poliigon"
        )

        self.setup_libraries()
        self.categories_path = os.path.join(
            self.user_addon_dir, "categories.json")

        default_asset_index_path = os.path.join(
            self.user_addon_dir,
            "AssetIndex",
            "asset_index.json",
        )
        self._asset_index = asset_index.AssetIndex(
            addon=self,
            addon_convention=addon_convention,
            path_cache=default_asset_index_path,
            addon_supported_model=addon_supported_model,
            log=None
        )
        self.online_previews_path = self.setup_temp_previews_folder()

    # TODO(Andreas): Could well be done in the constructor itself.
    #                Yet, it would break DCC implementations, atm.
    def init_addon_parameters(
            self,
            *,
            get_optin: Callable,
            callback_on_invalidated_token: Callable,
            report_message: Callable,
            report_exception: Callable,
            report_thread: Callable,
            status_listener: Callable,
            urls_dcc: Dict[str, str],
            notify_icon_info: Any,
            notify_icon_no_connection: Any,
            notify_icon_survey: Any,
            notify_icon_warn: Any,
            notify_update_body: str
            # TODO(Andreas): Once API RC gets instanced here, add:
            # page_size_online_assets: int,
            # page_size_my_assets: int,
            # callback_get_categories_done: Callable,
            # callback_get_asset_done: Callable,
            # callback_get_user_data_done: Callable,
            # callback_get_download_prefs_done: Callable
    ) -> None:
        """Initializes all parameters of PoliigonAddon."""

        self._api.get_optin = get_optin
        self._api.set_on_invalidated(callback_on_invalidated_token)
        self._api._status_listener = status_listener
        self._api.add_poliigon_urls(urls_dcc)
        self._api._report_message = report_message
        self._api._report_exception = report_exception

        self._tm.reporting_callable = report_thread

        self.notify.init_icons(
            icon_info=notify_icon_info,
            icon_no_connection=notify_icon_no_connection,
            icon_survey=notify_icon_survey,
            icon_warn=notify_icon_warn)
        self.notify.addon_params.update_body = notify_update_body

        # TODO(Andreas): Once API RC gets instanced in the constructor,
        #                add the following here:
        # params = self.api_rc._addon_params
        # params.online_assets_chunk_size = page_size_online_assets
        # params.my_assets_chunk_size = page_size_my_assets
        # params.callback_get_categories_done = callback_get_categories_done
        # params.callback_get_asset_done = callback_get_asset_done
        # params.callback_get_user_data_done = callback_get_user_data_done
        # params.callback_get_download_prefs_done = callback_get_download_prefs_done

    # Decorator copied from comment in thread_manager.py
    def run_threaded(key_pool: tm.PoolKeys,
                     max_threads: Optional[int] = None,
                     foreground: bool = False) -> Callable:
        """Schedules a function to run in a thread of a chosen pool."""
        def wrapped_func(func: Callable) -> Callable:
            @functools.wraps(func)
            def wrapped_func_call(self, *args, **kwargs):
                args = (self, ) + args
                return self._tm.queue_thread(func, key_pool,
                                             max_threads, foreground,
                                             *args, **kwargs)
            return wrapped_func_call
        return wrapped_func
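    # Hypothetical usage sketch (not part of this module): run_threaded is a
    # decorator factory, so DCC-side subclasses can push long-running methods
    # onto the shared ThreadManager pools. Class and method names below are
    # illustrative only:
    #
    #     class MyDccAddon(PoliigonAddon):
    #         @run_threaded(tm.PoolKeys.INTERACTIVE)
    #         def fetch_assets(self, query: str) -> None:
    #             ...  # runs in a worker thread; callers receive a Future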
    def setup_libraries(self):
        default_lib_path = os.path.join(self.user_addon_dir, "Library")
        multi_dir = self.settings_config["directories"]

        primary_lib_path = self.settings_config.get(
            "library", "primary", fallback=None)

        # If no primary lib is found in settings, set the default path as
        # primary; the DCC side should handle the value missing in settings
        # (e.g. choose main lib screen)
        if primary_lib_path not in [None, ""]:
            self.library_paths.append(primary_lib_path)
        else:
            self.library_paths.append(default_lib_path)

        for dir_idx in multi_dir.keys():
            path = self.settings_config.get("directories", str(dir_idx))
            self.library_paths.append(path)

    # TODO(Andreas): why is it called temp?
    def setup_temp_previews_folder(self) -> str:
        previews_dir = os.path.join(self.user_addon_dir, "OnlinePreviews")
        try:
            os.makedirs(previews_dir, exist_ok=True)
        except Exception:
            self.logger.exception(
                f"Failed to create directory: {previews_dir}")

        # Remove leftover temp lock files for thumbs
        for _file in os.listdir(previews_dir):
            file_path = os.path.join(previews_dir, _file)
            if os.path.isfile(file_path) and _file.endswith("_temp"):
                os.remove(file_path)
        return previews_dir

    def load_categories_from_disk(self) -> Optional[List]:
        """Loads categories from disk."""

        if not os.path.exists(self.categories_path):
            return None

        try:
            with open(self.categories_path, "r") as file_categories:
                category_json = json.load(file_categories)
            if not isinstance(category_json, list):
                return None
        # TODO(Andreas): error handling
        # Whatever error we encounter, the worst outcome is no cached categories
        except OSError as e:
            if e.errno == EACCES:
                return None
            else:
                return None
        except Exception:
            return None

        return category_json

    def save_categories_to_disk(self, category_json: List) -> None:
        """Stores categories (as received from API) to disk."""

        try:
            with open(self.categories_path, "w") as file_categories:
                json.dump(category_json, file_categories, indent=4)
        # TODO(Andreas): error handling
        # Whatever error we encounter, the worst outcome is no cached categories
        except OSError as e:
            if e.errno == ENOSPC:
                return
            elif e.errno == EACCES:
                return
            else:
                return
        except Exception:
            return

    def set_logger_verbose(self, verbose: bool) -> None:
        """To be used by DCC side to set main logger verbosity."""

        log_lvl_from_env = NOT_SET
        if self._env.config is not None:
            log_lvl_from_env = self._env.config.getint(
                "DEFAULT", "log_lvl", fallback=NOT_SET)
        if log_lvl_from_env != NOT_SET:
            self.logger.info(f"Log level forced by env: {log_lvl_from_env}")
            return
        log_lvl = INFO if verbose else ERROR
        self.logger.setLevel(log_lvl)

    def is_logged_in(self) -> bool:
        """Returns whether or not the user is currently logged in."""
        return self._api.token is not None and not self._api.invalidated

    def is_user_invalidated(self) -> bool:
        """Returns whether or not the user token was invalidated."""
        return self._api.invalidated

    def clear_user_invalidated(self):
        """Clears any invalidation flag for a user."""
        self._api.invalidated = False

    @run_threaded(tm.PoolKeys.INTERACTIVE)
    def log_in_with_credentials(self,
                                email: str,
                                password: str,
                                *,
                                wait_for_user: bool = False) -> Future:
        self.clear_user_invalidated()

        req = self._api.log_in(
            email,
            password
        )

        if req.ok:
            user_data = req.body.get("user", {})

            fut = self.create_user(user_data.get("name"), user_data.get("id"))
            if wait_for_user:
                fut.result(timeout=api.TIMEOUT)

            self.login_error = None
        else:
            self.login_error = req.error

        return req

    def log_in_with_website(self):
        pass

    def check_for_survey_notice(
            self,
            free_user_url: str,
            plan_user_url: str,
            interval: int,
            label: str,
            tooltip: str = "",
            auto_enqueue: bool = True) -> None:

        already_shown = self.settings_config.get(
            "user", "survey_notice_shown", fallback=None)

        if already_shown not in [None, ""]:
            # Never notify again, if already done once
            return

        first_local_asset = self.settings_config.get(
            "user", "first_local_asset", fallback=None)

        if first_local_asset in ["", None]:
            return

        def set_user_survey_flag() -> None:
            self.settings_config.set(
                "user", "survey_notice_shown", str(datetime.now()))
            self._settings.save_settings()

        first_asset_dl = datetime.strptime(
            first_local_asset, "%Y-%m-%d %H:%M:%S.%f")
        difference = datetime.now() - first_asset_dl
        if difference.days >= interval:
            self.notify.create_survey(
                is_free_user=self.is_free_user(),
                tooltip=tooltip,
                free_survey_url=free_user_url,
                active_survey_url=plan_user_url,
                label=label,
                auto_enqueue=auto_enqueue,
                on_dismiss_callable=set_user_survey_flag
            )

    @run_threaded(tm.PoolKeys.INTERACTIVE)
    def log_out(self):
        req = self._api.log_out()
        if req.ok:
            print("Logout success")
        else:
            print(req.error)

        self._api.token = None

        # Clear out user on logout.
        self.user = None

    def add_library_path(self,
                         path: str,
                         primary: bool = True,
                         update_local_assets: bool = True
                         ) -> None:
        if not os.path.isdir(path):
            self.logger.info(
                f"Library path to be added is not a directory: {path}")
            return
        elif path in self.library_paths:
            if primary:
                self.remove_library_path(path)
            else:
                self.logger.info(
                    f"Library path to be added is already in the list: {path}")
                return

        if primary:
            if len(self.library_paths) == 0:
                self.library_paths = [path]
            else:
                self.library_paths[0] = path
            self.settings_config.set("library", "primary", path)
        else:
            self.library_paths.append(path)
            idx = 0
            list_directory_idxs = list(
                self.settings_config["directories"].keys())
            if len(list_directory_idxs) > 0:
                idx = int(list_directory_idxs[-1]) + 1
            self.settings_config.set("directories", str(idx), path)

        self._settings.save_settings()

        if update_local_assets:
            self._asset_index.update_all_local_assets(self.library_paths)

    def remove_library_path(self,
                            path: str,
                            update_local_assets: bool = True
                            ) -> None:
        if path not in self.library_paths:
            self.logger.info(
                f"Library path to be removed is not in the list: {path}")
            return

        self.library_paths.remove(path)

        for dir_idx in self.settings_config["directories"].keys():
            dir_path = self.settings_config.get("directories", dir_idx)
            if dir_path == path:
                self.settings_config.remove_option("directories", dir_idx)
        self._settings.save_settings()

        if update_local_assets:
            self._asset_index.flush_is_local()
            self._asset_index.update_all_local_assets(self.library_paths)

    def replace_library_path(self,
                             path_old: str,
                             path_new: str,
                             primary: bool = True,
                             update_local_assets: bool = True
                             ) -> None:
        self.remove_library_path(path_old, update_local_assets=False)
        self._asset_index.flush_is_local()
        self.add_library_path(path_new,
                              primary=primary,
                              update_local_assets=update_local_assets)

    def get_library_paths(self):
        return self.library_paths

    def get_library_path(self, primary: bool = True):
        if self.library_paths and primary:
            return self.library_paths[0]
        elif len(self.library_paths) > 1:
            # TODO(Mitchell): Return the most relevant lib path based on some input (?)
            return None
        else:
            return None

    def _get_user_info(self) -> Tuple:
        req = self._api.get_user_info()
        user_name = None
        user_id = None

        if req.ok:
            data = req.body
            user_name = data["user"]["name"]
            user_id = data["user"]["id"]
            self.login_error = None
        else:
            # TODO(SOFT-1029): Create an error log for fail in get user info
            self.login_error = req.error

        return user_name, user_id

    def _get_credits(self):
        if self.user is None:
            msg = "_get_credits() called without user."
            self._api.report_message(
                "addon_get_credits", msg, "error")
            return

        req = self._api.get_user_balance()
        if req.ok:
            data = req.body
            self.user.credits = data.get("subscription_balance")
            self.user.credits_od = data.get("ondemand_balance")
        else:
            self.user.credits = None
            self.user.credits_od = None
            msg = f"ERROR: {req.error}"
            self._api.report_message(
                "addon_get_credits", msg, "error")

    def _get_subscription_details(self):
        """Fetches the current user's subscription status."""
        req = self._api.get_subscription_details()

        if req.ok:
            plan = req.body
            self.user.plan.update_from_dict(plan)

    @run_threaded(tm.PoolKeys.INTERACTIVE)
    def update_plan_data(self, done_callback: Optional[Callable] = None) -> None:
        # TODO(Joao): sub thread the two private functions
        self._get_credits()
        self._get_subscription_details()
        if done_callback is not None:
            done_callback()

    def create_user(
            self,
            user_name: Optional[str] = None,
            user_id: Optional[int] = None,
            done_callback: Optional[Callable] = None) -> Optional[Future]:

        if user_name is None or user_id is None:
            user_name, user_id = self._get_user_info()

        if user_name is None or user_id is None:
            return None

        self.user = PoliigonUser(
            user_name=user_name,
            user_id=user_id,
            plan=PoliigonSubscription(
                subscription_state=SubscriptionState.NOT_POPULATED)
        )

        future = self.update_plan_data(done_callback)
        return future

    def is_free_user(self) -> bool:
        """Identifies a free user, which neither
        has a plan nor on-demand credits."""

        if self.user is None:
            # Should not happen in practice with a Poliigon addon
            return False

        sub_state = self.user.plan.subscription_state
        free_plan = sub_state == SubscriptionState.FREE
        no_credits = self.user.credits in [0, None]
        no_od_credits = self.user.credits_od in [0, None]

        return free_plan and no_credits and no_od_credits

    def is_unlimited_user(self) -> bool:
        if self.user is None:
            return False
        elif self.user.plan in [None, SubscriptionState.NOT_POPULATED]:
            return False
        elif self.user.plan.is_unlimited is None:
            return False
        return self.user.plan.is_unlimited

    def is_paused_subscription(self) -> Optional[bool]:
        """Returns True, if the subscription is in paused state.

        Return value may be None, if there is no plan.
        """

        if self.user is None or self.user.plan is None:
            return None
        return self.user.plan.subscription_state == SubscriptionState.PAUSED

    def get_user_credits(self, incl_od: bool = True) -> int:
        """Returns the number of _spendable_ credits."""

        subscr_paused = self.is_paused_subscription()

        credits = self.user.credits
        credits_od = self.user.credits_od

        if not incl_od and credits_od is not None:
            credits_od = 0

        if credits is None and credits_od is None:
            return 0
        elif credits_od is None:
            return credits if not subscr_paused else 0
        elif credits is None:
            return credits_od
        else:
            if subscr_paused:
                return credits_od
            else:
                return credits + credits_od
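    # Illustrative examples (hypothetical balances) of the fallback logic in
    # get_user_credits above:
    #   credits=5,    credits_od=3,    not paused -> 8 (sum of both balances)
    #   credits=5,    credits_od=3,    paused     -> 3 (only on-demand spendable)
    #   credits=5,    credits_od=None, paused     -> 0 (subscription balance frozen)
    #   credits=None, credits_od=None             -> 0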
    def get_thumbnail_path(self, asset_name, index):
        """Returns the best fitting thumbnail preview for an asset.

        The primary grid UI preview will be named asset_preview1.png,
        all others will be named such as asset_preview1_1K.png.
        """
        if index == 0:
            # 0 is the small grid preview version of _preview1.

            # Fall back to the legacy option of .jpg files, if .png not found.
            thumb = os.path.join(
                self.online_previews_path,
                asset_name + "_preview1.png"
            )
            if not os.path.exists(thumb):
                thumb = os.path.join(
                    self.online_previews_path,
                    asset_name + "_preview1.jpg"
                )
        else:
            thumb = os.path.join(
                self.online_previews_path,
                asset_name + f"_preview{index}_1K.png")
        return thumb

    def get_type_default_size(self, asset_data: AssetData) -> Optional[str]:
        """Returns the default download size for the asset's type,
        falling back to the largest size available for the asset."""

        type_data = asset_data.get_type_data()
        sizes_data = type_data.get_size_list()

        size = None
        if asset_data.asset_type == AssetType.TEXTURE:
            size = self.settings_config.get("download", "tex_res")
        elif asset_data.asset_type == AssetType.MODEL:
            settings_size = self.settings_config.get(
                "download", "model_res")
            size_default = asset_data.model.size_default
            has_default = size_default is not None
            if settings_size in ["", "NONE", None] and has_default:
                size = size_default
            else:
                size = settings_size
        elif asset_data.asset_type == AssetType.HDRI:
            size = self.settings_config.get("download", "hdri_light")
            # TODO(Andreas): what about bg size?
        elif asset_data.asset_type == AssetType.BRUSH:
            size = self.settings_config.get("download", "brush")

        valid_size = size in sizes_data

        # If no valid size found, try to find at least one matching the
        # asset's available size data
        if not valid_size:
            for _size in reversed(SIZES):
                if _size in sizes_data:
                    size = _size
                    break

        return size

    def set_first_local_asset(self, force_update: bool = False) -> None:
        """Conditionally assigns the current date to the settings file.

        Meant to be used in conjunction with surveying, this should be called
        either on first download or first import, if the value hasn't already
        been set or if force_update is true.
        """

        first_asset_timestamp = self.settings_config.get(
            "user", "first_local_asset", fallback="")
        if first_asset_timestamp == "" or force_update:
            time_stamp = datetime.now()
            self.settings_config.set(
                "user", "first_local_asset", str(time_stamp))
            self._settings.save_settings()

    def set_first_preview_import(self, force_update: bool = False) -> None:
        first_wm_timestamp = self.settings_config.get(
            "user", "first_preview_import", fallback="")
        if first_wm_timestamp == "" or force_update:
            time_stamp = datetime.now()
            self.settings_config.set(
                "user", "first_preview_import", str(time_stamp))
            self._settings.save_settings()

    def set_first_purchase(self, force_update: bool = False) -> None:
        first_purchase_timestamp = self.settings_config.get(
            "user", "first_purchase", fallback="")
        if first_purchase_timestamp == "" or force_update:
            time_stamp = datetime.now()
            self.settings_config.set(
                "user", "first_purchase", str(time_stamp))
            self._settings.save_settings()

    def print_debug(self, *args, dbg=False, bg=True):
        """Prints out a debug statement with no separator line.

        Caches based on args up to a limit, to avoid excessive repeat prints.
        All args must be flat values, such as already cast to strings, else
        an error will be thrown.
        """
        if dbg:
            # Ensure all inputs are hashable, otherwise lru_cache fails.
            stringified = [str(arg) for arg in args]
            self._cached_print(*stringified, bg=bg)

    @lru_cache(maxsize=32)
    def _cached_print(self, *args, bg: bool):
        """A safe-to-cache function for printing."""
        print(*args)

    def open_asset_url(self, asset_id: int) -> None:
        asset_data = self._asset_index.get_asset(asset_id)
        url = self._api.add_utm_suffix(asset_data.url)
        webbrowser.open(url)

    def open_poliigon_link(self,
                           link_type: str,
                           add_utm_suffix: bool = True
                           ) -> None:
        """Opens a Poliigon URL."""

        # TODO(Andreas): As soon as P4B uses PoliigonAddon, move code from
        #                api.open_poliigon_link here and remove the function in api
        self._api.open_poliigon_link(
            link_type, add_utm_suffix, env_name=self._env.env_name)

    def get_wm_download_path(self, asset_name: str) -> str:
        """Returns an asset name path inside the OnlinePreviews folder."""

        path_poliigon = os.path.dirname(self._settings.base)
        path_thumbs = os.path.join(path_poliigon, "OnlinePreviews")
        path_wm_previews = os.path.join(path_thumbs, asset_name)
        return path_wm_previews

    def download_material_wm(
            self, files_to_download: List[Tuple[str, str]]) -> api.ApiResponse:
        """Synchronous function to download material previews."""

        urls = []
        files_dl = []
        for _url_wm, _filename_wm_dl in files_to_download:
            urls.append(_url_wm)
            files_dl.append(_filename_wm_dl)

        resp = self._api.pooled_preview_download(urls, files_dl)
        if not resp.ok:
            msg = f"Failed to download WM preview\n{resp}"
            self._api.report_message(
                "download_mat_preview_dl_failed", msg, "error")
            # Continue, as some may have worked.

        for _filename_wm_dl in files_dl:
            filename_wm = _filename_wm_dl[:-3]  # cut off the "_dl" suffix

            try:
                file_exists = os.path.exists(filename_wm)
                dl_exists = os.path.exists(_filename_wm_dl)
                if file_exists and dl_exists:
                    os.remove(filename_wm)
                elif not file_exists and not dl_exists:
                    raise FileNotFoundError
                if dl_exists:
                    os.rename(_filename_wm_dl, filename_wm)
            except FileNotFoundError:
                msg = f"Neither {filename_wm}, nor {_filename_wm_dl} exist"
                self._api.report_message(
                    "download_mat_existing_file", msg, "error")
            except FileExistsError:
                msg = f"File {filename_wm} already exists, failed to rename"
                self._api.report_message(
                    "download_mat_rename", msg, "error")
            except Exception as e:
                self.logger.exception(
                    "Unexpected exception while renaming WM preview")
                msg = f"Unexpected exception while renaming {_filename_wm_dl}\n{e}"
                self._api.report_message(
                    "download_wm_exception", msg, "error")
        return resp

    def get_config_param(self,
                         name_param: str,
                         name_group: str = "DEFAULT",
                         fallback: Optional[Any] = None
                         ) -> Any:
        """Safely reads a value from config (regardless of setup env or not)."""

        if self._env.config is None:
            return fallback
        return self._env.config.get(name_group, name_param, fallback=fallback)
@@ -0,0 +1,870 @@
|
||||
|
||||
# This program is free software; you can redistribute it and/or
|
||||
# modify it under the terms of the GNU General Public License
|
||||
# as published by the Free Software Foundation; either version 2
|
||||
# of the License, or (at your option) any later version.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this program; if not, write to the Free Software Foundation,
|
||||
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
|
||||
#
|
||||
# ##### END GPL LICENSE BLOCK #####
|
||||
|
||||
"""This module contains the API Remote Control."""
|
||||
|
||||
import concurrent
|
||||
from concurrent.futures import CancelledError, Future, TimeoutError
|
||||
from dataclasses import dataclass
|
||||
from enum import IntEnum, unique
|
||||
from functools import partial
|
||||
import os
|
||||
from queue import Queue
|
||||
from threading import Event, Lock, Thread
|
||||
import time
|
||||
from typing import Callable, Dict, List, Optional, Any
|
||||
|
||||
from .addon import PoliigonAddon
|
||||
from .api import (
|
||||
ApiResponse,
|
||||
TIMEOUT,
|
||||
TIMEOUT_STREAM)
|
||||
from .api_remote_control_params import (
|
||||
AddonRemoteControlParams,
|
||||
ApiJobParams,
|
||||
ApiJobParamsDownloadAsset,
|
||||
ApiJobParamsDownloadThumb,
|
||||
ApiJobParamsDownloadWMPreview,
|
||||
ApiJobParamsGetCategories,
|
||||
ApiJobParamsGetUserData,
|
||||
ApiJobParamsGetDownloadPrefs,
|
||||
ApiJobParamsGetAvailablePlans,
|
||||
ApiJobParamsGetUpgradePlan,
|
||||
ApiJobParamsPutUpgradePlan,
|
||||
ApiJobParamsResumePlan,
|
||||
ApiJobParamsGetAssets,
|
||||
ApiJobParamsLogin,
|
||||
ApiJobParamsPurchaseAsset,
|
||||
CmdLoginMode
|
||||
)
|
||||
from .assets import AssetData
|
||||
|
||||
|
||||
@dataclass
|
||||
class ApiResponseNewJob(ApiResponse):
|
||||
# This class is deliberately empty.
|
||||
# It only serves the purpose of being able to identify the ApiResponse
|
||||
# returned from get_new_job_response() via instanceof().
|
||||
pass
|
||||
|
||||
|
||||
def get_new_job_response() -> ApiResponseNewJob:
|
||||
resp = ApiResponseNewJob(
|
||||
body={"data": []},
|
||||
ok=False,
|
||||
error="job waiting to execute"
|
||||
)
|
||||
return resp


@unique
class JobType(IntEnum):
    LOGIN = 0
    GET_USER_DATA = 1  # credits, subscription, user info
    GET_CATEGORIES = 2
    GET_DOWNLOAD_PREFS = 3
    GET_AVAILABLE_PLANS = 4
    GET_UPGRADE_PLAN = 5
    PUT_UPGRADE_PLAN = 6
    RESUME_PLAN = 7
    GET_ASSETS = 10
    DOWNLOAD_THUMB = 11
    PURCHASE_ASSET = 12
    DOWNLOAD_ASSET = 13
    DOWNLOAD_WM_PREVIEW = 14
    UNIT_TEST = 15
    EXIT = 99999


class ApiJob():
    """Describes an ApiJob and gets passed through the queues,
    subsequently being processed in thread_schedule and thread_collect.
    """

    def __init__(
            self,
            job_type: JobType,
            params: Optional[ApiJobParams] = None,
            callback_cancel: Optional[Callable] = None,
            callback_progress: Optional[Callable] = None,
            callback_done: Optional[Callable] = None,
            result: Optional[ApiResponse] = None,
            future: Optional[Future] = None,
            timeout: Optional[float] = None
    ):
        self.job_type = job_type
        self.params = params
        self.callback_cancel = callback_cancel
        self.callback_progress = callback_progress
        self.callback_done = callback_done
        if result is not None:
            self.result = result
        else:
            self.result = get_new_job_response()
        self.future = future
        self.timeout = timeout

    def __eq__(self, other):
        return self.job_type == other.job_type and self.params == other.params

    def __str__(self) -> str:
        return f"{self.job_type.name}\n{str(self.params)}"


class ApiRemoteControl():

    def __init__(self, addon: PoliigonAddon):
        # Only members defined in addon_core.PoliigonAddon are allowed to be
        # used inside this module
        self._addon = addon
        self._addon_params = AddonRemoteControlParams()
        self._tm = addon._tm
        self._api = addon._api
        self._asset_index = addon._asset_index

        is_dev = addon._env.env_name != "prod"
        self.logger = self._addon.log_manager.initialize_logger(
            "APIRC", have_filehandler=is_dev)

        self.queue_jobs = Queue()
        self._start_thread_schedule()
        self.queue_jobs_done = Queue()
        self._start_thread_collect()

        self._start_thread_watchdog()

        self.lock_jobs_in_flight = Lock()
        self.jobs_in_flight = {}  # {job_type: [jobs]}

        self.in_shutdown = False

        self.init_stats()

    def init_stats(self) -> None:
        """Initializes job statistics counters."""

        self.cnt_added = {}
        self.cnt_queued = {}
        self.cnt_cancelled = {}
        self.cnt_exec = {}
        self.cnt_done = {}
        self.cnt_restart_schedule = 0
        self.cnt_restart_collect = 0
        for job_type in JobType.__members__.values():
            self.cnt_added[job_type] = 0
            self.cnt_queued[job_type] = 0
            self.cnt_cancelled[job_type] = 0
            self.cnt_exec[job_type] = 0
            self.cnt_done[job_type] = 0

    def get_stats(self) -> Dict:
        """Returns job statistics counters as a dictionary."""

        stats = {}
        stats["Jobs added"] = self.cnt_added
        stats["Jobs queued"] = self.cnt_queued
        stats["Jobs cancelled"] = self.cnt_cancelled
        stats["Jobs exec"] = self.cnt_exec
        stats["Jobs done"] = self.cnt_done
        stats["Restart schedule"] = self.cnt_restart_schedule
        stats["Restart collect"] = self.cnt_restart_collect
        return stats

    def _start_thread_schedule(self) -> None:
        self.schedule_running = False
        thd_schedule_report_wrapped = self._tm.reporting_callable(
            self._thread_schedule.__name__,
            self._thread_schedule)
        self.thd_schedule = Thread(target=thd_schedule_report_wrapped)
        self.thd_schedule.name = "API RC Schedule"
        self.thd_schedule.start()

    def _start_thread_collect(self) -> None:
        self.collect_running = False
        thd_collect_report_wrapped = self._tm.reporting_callable(
            self._thread_collect.__name__,
            self._thread_collect)
        self.thd_collect = Thread(target=thd_collect_report_wrapped)
        self.thd_collect.name = "API RC Collect"
        self.thd_collect.start()

    def _start_thread_watchdog(self) -> None:
        self.watchdog_running = False
        self.event_watchdog = Event()
        thd_watchdog_report_wrapped = self._tm.reporting_callable(
            self._thread_watchdog.__name__,
            self._thread_watchdog)
        self.thd_watchdog = Thread(target=thd_watchdog_report_wrapped)
        self.thd_watchdog.name = "API RC Watchdog"
        self.thd_watchdog.start()

    def add_job_login(self,
                      mode: CmdLoginMode = CmdLoginMode.LOGIN_BROWSER,
                      email: Optional[str] = None,
                      pwd: Optional[str] = None,
                      time_since_enable: Optional[int] = None,
                      callback_cancel: Optional[Callable] = None,
                      callback_progress: Optional[Callable] = None,
                      callback_done: Optional[Callable] = None,
                      force: bool = True
                      ) -> None:
        """Convenience function to add a login or logout job."""

        if mode == CmdLoginMode.LOGOUT:
            self.empty_pipeline()
        else:  # login
            self._asset_index.flush()

        params = ApiJobParamsLogin(mode, email, pwd, time_since_enable)
        self.add_job(
            job_type=JobType.LOGIN,
            params=params,
            callback_cancel=callback_cancel,
            callback_progress=callback_progress,
            callback_done=callback_done,
            force=force,
            timeout=TIMEOUT)

    def add_job_get_user_data(self,
                              user_name: str,
                              user_id: str,
                              callback_cancel: Optional[Callable] = None,
                              callback_progress: Optional[Callable] = None,
                              callback_done: Optional[Callable] = None,
                              force: bool = True
                              ) -> None:
        """Convenience function to add a get user data job."""

        params = ApiJobParamsGetUserData(user_name, user_id)
        self.add_job(
            job_type=JobType.GET_USER_DATA,
            params=params,
            callback_cancel=callback_cancel,
            callback_progress=callback_progress,
            callback_done=callback_done,
            force=force,
            timeout=TIMEOUT)

    def add_job_get_download_prefs(self,
                                   *,
                                   callback_cancel: Optional[Callable] = None,
                                   callback_progress: Optional[Callable] = None,
                                   callback_done: Optional[Callable] = None,
                                   force: bool = True
                                   ) -> None:
        """Convenience function to get user download preferences."""

        params = ApiJobParamsGetDownloadPrefs()
        self.add_job(
            job_type=JobType.GET_DOWNLOAD_PREFS,
            params=params,
            callback_cancel=callback_cancel,
            callback_progress=callback_progress,
            callback_done=callback_done,
            force=force,
            timeout=TIMEOUT)

    def add_job_get_available_plans(self,
                                    callback_cancel: Optional[Callable] = None,
                                    callback_progress: Optional[Callable] = None,
                                    callback_done: Optional[Callable] = None,
                                    force: bool = True
                                    ) -> None:
        params = ApiJobParamsGetAvailablePlans()
        self.add_job(
            job_type=JobType.GET_AVAILABLE_PLANS,
            params=params,
            callback_cancel=callback_cancel,
            callback_progress=callback_progress,
            callback_done=callback_done,
            force=force,
            timeout=TIMEOUT)

    def add_job_get_upgrade_plan(self,
                                 callback_cancel: Optional[Callable] = None,
                                 callback_progress: Optional[Callable] = None,
                                 callback_done: Optional[Callable] = None,
                                 force: bool = True
                                 ) -> None:
        params = ApiJobParamsGetUpgradePlan()
        self.add_job(
            job_type=JobType.GET_UPGRADE_PLAN,
            params=params,
            callback_cancel=callback_cancel,
            callback_progress=callback_progress,
            callback_done=callback_done,
            force=force,
            timeout=TIMEOUT)

    def add_job_put_upgrade_plan(self,
                                 callback_cancel: Optional[Callable] = None,
                                 callback_progress: Optional[Callable] = None,
                                 callback_done: Optional[Callable] = None,
                                 force: bool = True
                                 ) -> None:
        params = ApiJobParamsPutUpgradePlan()
        self.add_job(
            job_type=JobType.PUT_UPGRADE_PLAN,
            params=params,
            callback_cancel=callback_cancel,
            callback_progress=callback_progress,
            callback_done=callback_done,
            force=force,
            timeout=TIMEOUT)

    def add_job_resume_plan(self,
                            callback_cancel: Optional[Callable] = None,
                            callback_progress: Optional[Callable] = None,
                            callback_done: Optional[Callable] = None,
                            force: bool = True
                            ) -> None:
        params = ApiJobParamsResumePlan()
        self.add_job(
            job_type=JobType.RESUME_PLAN,
            params=params,
            callback_cancel=callback_cancel,
            callback_progress=callback_progress,
            callback_done=callback_done,
            force=force,
            timeout=TIMEOUT)

    def add_job_get_categories(self,
                               callback_cancel: Optional[Callable] = None,
                               callback_progress: Optional[Callable] = None,
                               callback_done: Optional[Callable] = None,
                               force: bool = True
                               ) -> None:
        """Convenience function to add a get categories job."""

        params = ApiJobParamsGetCategories()
        self.add_job(
            job_type=JobType.GET_CATEGORIES,
            params=params,
            callback_cancel=callback_cancel,
            callback_progress=callback_progress,
            callback_done=callback_done,
            force=force,
            timeout=TIMEOUT)

    def add_job_get_assets(self,
                           library_paths: List[str],
                           tab: str,  # KEY_TAB_ONLINE, KEY_TAB_MY_ASSETS
                           category_list: List[str] = ["All Assets"],
                           search: str = "",
                           idx_page: int = 1,
                           page_size: int = 10,
                           force_request: bool = False,
                           do_get_all: bool = True,
                           callback_cancel: Optional[Callable] = None,
                           callback_progress: Optional[Callable] = None,
                           callback_done: Optional[Callable] = None,
                           force: bool = True,
                           ignore_old_names: bool = True
                           ) -> None:
        """Convenience function to add a get assets job."""

        params = ApiJobParamsGetAssets(library_paths,
                                       tab,
                                       category_list,
                                       search,
                                       idx_page,
                                       page_size,
                                       force_request,
                                       do_get_all,
                                       ignore_old_names)
        self.add_job(
            job_type=JobType.GET_ASSETS,
            params=params,
            callback_cancel=callback_cancel,
            callback_progress=callback_progress,
            callback_done=callback_done,
            force=force,
            timeout=TIMEOUT)

    def add_job_download_thumb(self,
                               asset_id: int,
                               url: str,
                               path: str,
                               idx_thumb: int = -1,
                               do_update: bool = False,
                               callback_cancel: Optional[Callable] = None,
                               callback_progress: Optional[Callable] = None,
                               callback_done: Optional[Callable] = None,
                               force: bool = False
                               ) -> None:
        """Convenience function to add a download thumb job."""

        params = ApiJobParamsDownloadThumb(
            asset_id, url, path, do_update, idx_thumb=idx_thumb)
        temp_path = f"{path}_temp"
        if not os.path.isfile(temp_path):
            # Mark this thumb as "download in progress" via an empty temp file
            with open(temp_path, "wb"):
                pass
        elif os.path.isfile(path):
            # Thumb exists already, just report the job as done
            job = ApiJob(
                job_type=JobType.DOWNLOAD_THUMB,
                params=params,
                callback_cancel=callback_cancel,
                callback_progress=callback_progress,
                callback_done=callback_done)
            if callback_done is not None:
                callback_done(job=job)
            return
        else:
            # A download of this thumb is already in progress
            return

        self.add_job(
            job_type=JobType.DOWNLOAD_THUMB,
            params=params,
            callback_cancel=callback_cancel,
            callback_progress=callback_progress,
            callback_done=callback_done,
            force=force,
            timeout=TIMEOUT_STREAM)

    def add_job_purchase_asset(
            self,
            asset_data: AssetData,
            category_list: List[str] = ["All Assets"],
            search: str = "",
            job_download: Optional["ApiJob"] = None,
            callback_cancel: Optional[Callable] = None,
            callback_progress: Optional[Callable] = None,
            callback_done: Optional[Callable] = None,
            force: bool = True
    ) -> None:
        """Convenience function to add a purchase asset job."""

        params = ApiJobParamsPurchaseAsset(asset_data,
                                           category_list,
                                           search,
                                           job_download)
        self.add_job(
            job_type=JobType.PURCHASE_ASSET,
            params=params,
            callback_cancel=callback_cancel,
            callback_progress=callback_progress,
            callback_done=callback_done,
            force=force,
            timeout=TIMEOUT)

    def create_job_download_asset(self,
                                  asset_data: AssetData,
                                  size: str = "2K",
                                  size_bg: str = "",
                                  type_bg: str = "EXR",
                                  lod: str = "NONE",
                                  variant: str = "",
                                  download_lods: bool = False,
                                  native_mesh: bool = True,
                                  renderer: str = "",
                                  callback_cancel: Optional[Callable] = None,
                                  callback_progress: Optional[Callable] = None,
                                  callback_done: Optional[Callable] = None
                                  ) -> ApiJob:
        """Convenience function to create a download asset job."""

        params = ApiJobParamsDownloadAsset(
            self._addon, asset_data, size, size_bg, type_bg, lod, variant,
            download_lods, native_mesh, renderer)
        job = ApiJob(
            job_type=JobType.DOWNLOAD_ASSET,
            params=params,
            callback_cancel=callback_cancel,
            callback_progress=callback_progress,
            callback_done=callback_done,
            timeout=TIMEOUT_STREAM
        )

        # Due to the limitation of the number of threads, the download thread
        # may not start immediately. In that case it would seem as if nothing
        # is happening.
        asset_data.state.dl.start()
        if callback_progress is not None:
            callback_progress(job)

        return job

    def add_job_download_asset(self,
                               asset_data: AssetData,
                               size: str = "2K",
                               size_bg: str = "",
                               type_bg: str = "EXR",
                               lod: str = "NONE",
                               variant: str = "",
                               download_lods: bool = False,
                               native_mesh: bool = True,
                               renderer: str = "",
                               callback_cancel: Optional[Callable] = None,
                               callback_progress: Optional[Callable] = None,
                               callback_done: Optional[Callable] = None,
                               force: bool = True
                               ) -> None:
        """Convenience function to add a download asset job."""

        self.cnt_added[JobType.DOWNLOAD_ASSET] += 1
        job = self.create_job_download_asset(
            asset_data,
            size,
            size_bg,
            type_bg,
            lod,
            variant,
            download_lods,
            native_mesh,
            renderer,
            callback_cancel,
            callback_progress,
            callback_done
        )
        self.enqueue_job(job, force)

    def add_job_download_wm_preview(
            self,
            asset_data: AssetData,
            renderer: str = "",
            callback_cancel: Optional[Callable] = None,
            callback_progress: Optional[Callable] = None,
            callback_done: Optional[Callable] = None,
            force: bool = True
    ) -> None:
        """Convenience function to add a download WM preview job."""

        params = ApiJobParamsDownloadWMPreview(asset_data,
                                               renderer)
        self.add_job(
            job_type=JobType.DOWNLOAD_WM_PREVIEW,
            params=params,
            callback_cancel=callback_cancel,
            callback_progress=callback_progress,
            callback_done=callback_done,
            force=force,
            timeout=TIMEOUT_STREAM)

    def add_job_exit(self) -> None:
        """Convenience function to add an APIRC exit job."""

        job = ApiJob(
            job_type=JobType.EXIT,
            params={},
            callback_cancel=None,
            callback_progress=None,
            callback_done=None)
        # Enqueue directly, as the actual enqueue_job() method gets disabled
        # before shutdown
        self.queue_jobs.put(job)

    def _is_job_already_enqueued(self, job: ApiJob) -> bool:
        """Returns True if an identical job exists already."""

        with self.lock_jobs_in_flight:
            jobs_in_flight_copy = self.jobs_in_flight.copy()

        try:
            return job in jobs_in_flight_copy[job.job_type]
        except KeyError:
            return False

    def enqueue_job(self, job: ApiJob, force: bool = True) -> None:
        """Enqueues a single ApiJob.

        Arguments:
        force: Default True. If False, enqueue only if not queued already.
        """

        if not force and self._is_job_already_enqueued(job):
            return

        self.cnt_queued[job.job_type] += 1
        self.queue_jobs.put(job)
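The `force` flag's deduplication can be sketched in isolation. `DedupQueue` is a hypothetical, simplified stand-in (no locking, plain strings instead of `ApiJob`):

```python
import queue


class DedupQueue:
    """Skip enqueueing when an equal item is already in flight,
    unless force=True (mirrors enqueue_job's force semantics)."""

    def __init__(self):
        self.q = queue.Queue()
        self.in_flight = []

    def enqueue(self, item, force=True):
        # With force=False, an equal in-flight item suppresses the enqueue
        if not force and item in self.in_flight:
            return False
        self.in_flight.append(item)
        self.q.put(item)
        return True


dq = DedupQueue()
print(dq.enqueue("login", force=False))  # True: first of its kind
print(dq.enqueue("login", force=False))  # False: deduplicated
print(dq.enqueue("login", force=True))   # True: forced duplicate
```

In the real class, equality is defined by `ApiJob.__eq__` (same `job_type` and same `params`), so two downloads of different assets never deduplicate each other.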

    def add_job(self,
                job_type: JobType,
                params: Any = {},  # Class from API RC Params
                callback_cancel: Optional[Callable] = None,
                callback_progress: Optional[Callable] = None,
                callback_done: Optional[Callable] = None,
                force: bool = True,
                timeout: Optional[float] = None
                ) -> None:
        """Adds a job to be processed by API remote control."""

        self.cnt_added[job_type] += 1

        job = ApiJob(
            job_type=job_type,
            params=params,
            callback_cancel=callback_cancel,
            callback_progress=callback_progress,
            callback_done=callback_done,
            timeout=timeout)
        self.enqueue_job(job, force)

    def _release_job(self, job: ApiJob) -> None:
        """Removes a finished job from 'in flight' list."""

        try:
            with self.lock_jobs_in_flight:
                self.jobs_in_flight[job.job_type].remove(job)
        except (KeyError, ValueError):
            pass  # List of job type not found or job not found in list

    def enqueue_job_shutdown(self, job: ApiJob, force: bool = True):
        """Used to replace enqueue_job() method during shutdown to avoid any
        new jobs being enqueued.

        Function is deliberately empty!
        """
        pass

    def empty_pipeline(self) -> None:
        """Gets rid of any jobs in API RC's pipeline.

        In what way is this different to wait_for_all()?
        wait_for_all() will get rid of all jobs (or just those of a single
        type) currently in the pipeline, but it makes no attempt to avoid
        new jobs being added. For example, getting user data spawns five
        different follow-up jobs, some of those then spawning more follow-up
        jobs (e.g. the get assets ones). empty_pipeline() additionally
        blocks enqueueing while the pipeline is flushed, so no such
        follow-up jobs can sneak back in.
        """

        # Have all jobs already in thread pool exit as early as possible
        self.in_shutdown = True

        # Prevent any new jobs from being queued
        f_enqueue_job_bak = self.enqueue_job
        self.enqueue_job = self.enqueue_job_shutdown
        # Empty job queue to prevent new jobs from being scheduled in
        # thread pool
        while not self.queue_jobs.empty():
            self.queue_jobs.get_nowait()

        # Yield, so running threads get a chance to finish.
        # Sleep should be short, but longer than OS's tick.
        time.sleep(0.050)  # 50 ms

        self.wait_for_all(timeout=None)

        # Re-enable normal operation
        self.enqueue_job = f_enqueue_job_bak
        self.in_shutdown = False

    def shutdown(self) -> None:
        """Stops remote control's threads."""

        # Tear watchdog down, first (we do not want it to restart anything)
        self.watchdog_running = False
        self.event_watchdog.set()
        self.thd_watchdog.join()
        # Have all jobs already in thread pool exit as early as possible
        self.in_shutdown = True
        # Prevent any new jobs from being queued
        self.enqueue_job = self.enqueue_job_shutdown
        # Empty job queue to prevent new jobs from being scheduled in
        # thread pool
        while not self.queue_jobs.empty():
            self.queue_jobs.get_nowait()
        # Enqueue the "exit job", which will lead to _thread_schedule and
        # _thread_collect to exit
        self.add_job_exit()
        # Lastly wait for everything to come to a halt.
        # timeout=None, use job type specific timeouts
        self.wait_for_all(timeout=None)

    def _wait_for_type(self,
                       jobs_in_flight_copy: Dict,
                       job_type: JobType,
                       do_wait: bool,
                       timeout: Optional[int]
                       ) -> None:
        """Cancels all jobs of given type, optionally waits until cancelled."""

        for job in jobs_in_flight_copy[job_type]:
            try:
                with self.lock_jobs_in_flight:
                    self.jobs_in_flight[job.job_type].remove(job)
            except (KeyError, ValueError):
                pass  # List of job type not found or job not found in list

            if job.result is None:
                job.result = ApiResponse(ok=True,
                                         body={"data": []},
                                         error="job cancelled")

            if job.future is None:
                self.logger.warning(f"Future is None: {job.job_type.name}")
                continue
            elif job.future.cancel():
                self.cnt_cancelled[job.job_type] += 1
                continue
            try:
                job.callback_cancel()
            except TypeError:
                pass  # Not every job has a cancel callback
            if do_wait:
                try:
                    if timeout is None:
                        timeout = job.timeout
                    job.future.result(timeout)
                except (CancelledError,
                        TimeoutError,
                        concurrent.futures._base.TimeoutError) as e:
                    msg = ("API RC's job did not return upon cancel. "
                           f"Timeout: {timeout}\nJob: {str(job)}")
                    self.logger.exception(msg)
                    self._addon._api.report_exception(e)

    def wait_for_all(self,
                     job_type: Optional[JobType] = None,
                     do_wait: bool = True,
                     timeout: Optional[int] = None
                     ) -> None:
        """Cancels all jobs or just a given type, optionally waits until
        cancelled.

        Arguments:
        job_type: Specify to cancel jobs of a certain type, None for all types
        do_wait: Set to True to wait for cancellation, otherwise just cancel
            and return immediately.
        timeout: Time to wait for futures to finish. If None,
            job type specific timeouts will be used (defined in the
            add_job_xyz() functions above).
        """

        with self.lock_jobs_in_flight:
            jobs_in_flight_copy = self.jobs_in_flight.copy()

        if job_type is None:
            for job_type in jobs_in_flight_copy:
                self._wait_for_type(
                    jobs_in_flight_copy, job_type, do_wait, timeout)
        elif job_type in jobs_in_flight_copy:
            self._wait_for_type(
                jobs_in_flight_copy, job_type, do_wait, timeout)

    def is_job_type_active(self, job_type: JobType) -> bool:
        """Returns True if there's at least one job of given type in flight."""

        return len(self.jobs_in_flight.get(job_type, [])) > 0

    def _thread_schedule(self) -> None:
        """Thread waiting on job queue to start jobs in thread pool."""

        self.schedule_running = True
        while self.schedule_running:
            job = self.queue_jobs.get()

            self.cnt_exec[job.job_type] += 1

            if job.job_type != JobType.EXIT:
                with self.lock_jobs_in_flight:
                    try:
                        self.jobs_in_flight[job.job_type].append(job)
                    except KeyError:
                        self.jobs_in_flight[job.job_type] = [job]

            if job.job_type != JobType.EXIT:
                job.future = self._tm.queue_thread(
                    job.params.thread_execute,
                    job.params.POOL_KEY,
                    max_threads=None,
                    foreground=False,
                    api_rc=self,
                    job=job
                )
            else:
                # JobType.EXIT
                self.queue_jobs_done.put(job)  # stop collector
                self.schedule_running = False

            def callback_enqueue_done(fut, job: ApiJob) -> None:
                self.queue_jobs_done.put(job)

            cb_done = partial(callback_enqueue_done, job=job)
            try:
                job.future.add_done_callback(cb_done)
            except AttributeError as e:
                # JobType.EXIT has no Future
                if job.job_type != JobType.EXIT:
                    msg = f"Job {job.job_type.name} has no Future"
                    self.logger.exception(msg)
                    self._addon._api.report_exception(e)

    def _thread_collect(self) -> None:
        """Thread awaiting threaded jobs to finish, then executes job's post
        processing.
        """

        self.collect_running = True
        while self.collect_running:
            job = self.queue_jobs_done.get()

            if job.job_type != JobType.EXIT:
                try:
                    job.params.finish(self, job)
                except BaseException as e:
                    # Finish handlers are not allowed to tear down our
                    # collect thread...
                    msg = ("A job's finish function failed unexpectedly: "
                           f"{str(job)}")
                    self.logger.exception(msg)
                    self._addon._api.report_exception(e)
            else:
                # JobType.EXIT
                self.collect_running = False
                break

            try:
                job.callback_done(job=job)
            except TypeError:
                pass  # There is no done callback
            except BaseException as e:
                # Done callbacks are not allowed to tear down our
                # collect thread...
                msg = ("A job's done callback failed unexpectedly: "
                       f"{str(job)}")
                self.logger.exception(msg)
                self._addon._api.report_exception(e)

            self._release_job(job)
            self.cnt_done[job.job_type] += 1

    def _thread_watchdog(self) -> None:
        self.watchdog_running = True
        while self.watchdog_running:
            # Event used as a sleep (will only get set during shutdown)
            self.event_watchdog.wait(1.0)

            if not self.watchdog_running:
                break

            if not self.thd_schedule.is_alive():
                self.cnt_restart_schedule += 1
                self._start_thread_schedule()

                msg = f"API RC's schedule failed ({self.cnt_restart_schedule})"
                self._addon._api.report_message(
                    "apirc_thread_failure_schedule", msg, "error")
                self.logger.critical(msg)

            if not self.thd_collect.is_alive():
                self.cnt_restart_collect += 1
                self._start_thread_collect()

                msg = f"API RC's collect failed ({self.cnt_restart_collect})"
                self._addon._api.report_message(
                    "apirc_thread_failure_collect", msg, "error")
                self.logger.critical(msg)
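The schedule/collect machinery above is a two-queue pipeline: one thread submits queued jobs to a pool, and each future's done-callback forwards the finished job to a second queue for post-processing, with a sentinel "exit job" shutting both threads down. A minimal standalone sketch of that pattern (hypothetical names, not the add-on's API):

```python
import queue
from concurrent.futures import ThreadPoolExecutor
from threading import Thread

EXIT = object()  # sentinel, playing the role of JobType.EXIT


def run_pipeline(work_items):
    """Schedule thread submits jobs to a pool; done-callbacks forward
    finished futures to a collect thread for post-processing."""
    jobs = queue.Queue()
    done = queue.Queue()
    results = []

    def collect():
        while True:
            fut = done.get()
            if fut is EXIT:
                break
            results.append(fut.result())  # job post-processing

    def schedule(pool):
        while True:
            job = jobs.get()
            if job is EXIT:
                break
            fut = pool.submit(job)
            fut.add_done_callback(done.put)  # forward finished future

    collector = Thread(target=collect)
    collector.start()
    with ThreadPoolExecutor(max_workers=4) as pool:
        scheduler = Thread(target=schedule, args=(pool,))
        scheduler.start()
        for item in work_items:
            jobs.put(item)
        jobs.put(EXIT)
        scheduler.join()
    # Pool context exit waits for all futures, so `done` is complete here
    done.put(EXIT)
    collector.join()
    return results


print(sorted(run_pipeline([lambda n=n: n * n for n in range(5)])))
# [0, 1, 4, 9, 16]
```

The production class additionally tracks in-flight jobs per type, counts statistics, and runs a watchdog that restarts either thread if it dies; the sketch keeps only the queue topology.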
File diff suppressed because it is too large
@@ -0,0 +1,817 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

import os
from typing import Callable, Dict, List, Optional, Tuple
from dataclasses import dataclass
from concurrent.futures import (Future,
                                ThreadPoolExecutor)
from xml.etree import ElementTree
import errno
import time
from threading import Lock

from .api import (ApiResponse,
                  DownloadStatus,
                  DQStatus,
                  ERR_OS_NO_PERMISSION,
                  ERR_URL_EXPIRED,
                  ERR_OS_NO_SPACE,
                  ERR_LIMIT_DOWNLOAD_RATE)
from .assets import (AssetType,
                     AssetData,
                     PREVIEWS)


DOWNLOAD_TEMP_SUFFIX = "dl"

DOWNLOAD_POLL_INTERVAL = 0.25
MAX_DOWNLOAD_RETRIES = 10
MAX_PARALLEL_ASSET_DOWNLOADS = 2
MAX_PARALLEL_DOWNLOADS_PER_ASSET = 8
SIZE_DEFAULT_POOL = 10
MAX_RETRIES_PER_FILE = 3
MAX_RETRIES_PER_ASSET = 2

# This list defines the priority order for falling back on available formats
# NOTE: This is only for Convention 1 downloads
SUPPORTED_TEX_FORMATS = ["jpg", "png", "tiff", "exr"]
MODEL_FILE_EXT = ["fbx", "blend", "max", "c4d", "skp", "ma"]


class DownloadTimer():
    start_time: float
    end_time: float
    duration: float

    def start(self) -> None:
        self.start_time = time.monotonic()

    def end(self) -> float:
        self.end_time = time.monotonic()
        self.duration = self.end_time - self.start_time
        return self.duration
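The timer above deliberately uses `time.monotonic()`: unlike `time.time()`, a monotonic clock cannot jump backwards (or forwards) when the system clock is adjusted mid-download, so the measured duration can never come out negative. A minimal re-statement with usage:

```python
import time


class Timer:
    # Mirrors DownloadTimer above: monotonic clock, immune to wall-clock jumps
    def start(self) -> None:
        self.start_time = time.monotonic()

    def end(self) -> float:
        self.duration = time.monotonic() - self.start_time
        return self.duration


t = Timer()
t.start()
time.sleep(0.01)  # stand-in for the actual download
elapsed = t.end()
print(elapsed > 0.0)  # True
```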


@dataclass
class DynamicFile:
    name: Optional[str]
    contents: Optional[str]


@dataclass
class FileDownload:
    asset_id: int
    url: str
    filename: str
    convention: int
    size_expected: int
    size_downloaded: int = 0
    resolution_size: Optional[str] = None
    status: DownloadStatus = DownloadStatus.INITIALIZED
    directory: str = ""
    fut: Optional[Future] = None
    duration: float = -1.0  # avoid div by zero, but result stays clearly wrong
    # NOTE: A default of `Lock()` would be evaluated once and shared by all
    # instances; create a fresh Lock per instance in __post_init__ instead.
    lock: Optional[Lock] = None
    max_retries: int = MAX_RETRIES_PER_FILE
    retries: int = 0
    error: Optional[str] = None
    cf_ray: Optional[str] = None

    def __post_init__(self) -> None:
        if self.lock is None:
            self.lock = Lock()

    def do_retry(self) -> bool:
        return self.retries < self.max_retries

    def get_path(self, temp: bool = False) -> str:
        directory = self.directory
        return os.path.join(directory, self.get_filename(temp))

    def get_filename(self, temp: bool = False) -> str:
        if temp:
            return self.filename + DOWNLOAD_TEMP_SUFFIX
        else:
            return self.filename

    def set_status_cancelled(self) -> None:
        # do not overwrite final states
        with self.lock:
            is_done = self.status == DownloadStatus.DONE
            has_error = self.status == DownloadStatus.ERROR
            if not is_done and not has_error:
                self.status = DownloadStatus.CANCELLED

    def set_status_ongoing(self) -> bool:
        res = True
        # do not overwrite user cancellation
        with self.lock:
            if self.status != DownloadStatus.CANCELLED:
                self.status = DownloadStatus.ONGOING
            else:
                res = False
        return res

    def set_status_error(self) -> None:
        with self.lock:
            self.status = DownloadStatus.ERROR

    def set_status_done(self) -> None:
        with self.lock:
            self.status = DownloadStatus.DONE
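A dataclass field default like `lock: Lock = Lock()` is evaluated once at class-definition time, so every instance would share the same lock object and all downloads would serialize on it. `field(default_factory=Lock)` (or a `__post_init__` assignment) gives each instance its own lock. A minimal demonstration of the difference:

```python
from dataclasses import dataclass, field
from threading import Lock


@dataclass
class WithSharedLock:
    # Pitfall: this default is created once, then shared by every instance.
    # (dataclasses only rejects list/dict/set defaults, not Lock objects.)
    lock: Lock = Lock()


@dataclass
class WithOwnLock:
    # default_factory runs per instance: each object gets a fresh Lock.
    lock: Lock = field(default_factory=Lock)


a, b = WithSharedLock(), WithSharedLock()
c, d = WithOwnLock(), WithOwnLock()
print(a.lock is b.lock, c.lock is d.lock)  # True False
```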
|
||||
|
||||
|
||||
class AssetDownload:
|
||||
addon = None # PoliigonAddon - No typing due to circular import
|
||||
data_payload: Dict
|
||||
|
||||
tpe: ThreadPoolExecutor
|
||||
|
||||
asset_data: AssetData = None
|
||||
uuid: Optional[str] = None
|
||||
download_list: Optional[List[FileDownload]] = None
|
||||
dynamic_files_list: Optional[List[DynamicFile]] = None
|
||||
size_asset_bytes_expected: int = 0
|
||||
size_asset_bytes_downloaded: int = 0
|
||||
|
||||
# Directory (named after the size) to be used for convention 1 assets
|
||||
dir_size_target: str = None
|
||||
|
||||
# Status flags
|
||||
max_retries: int = MAX_RETRIES_PER_ASSET
|
||||
retries: int = 0
|
||||
stop_files_retry: bool = False
|
||||
all_done: bool = False
|
||||
is_cancelled: bool = False
|
||||
any_error: bool = False
|
||||
res_error: Optional[str] = None
|
||||
res_error_message: Optional[str] = None
|
||||
error_dl_list: List[FileDownload] = list
|
||||
    dl_error: Optional[FileDownload] = None

    def __init__(self,
                 addon,  # PoliigonAddon - No typing due to circular import
                 asset_data: AssetData,
                 size: str,
                 dir_target: str,
                 lod: str = "NONE",
                 download_lods: bool = False,
                 native_mesh: bool = False,
                 renderer: Optional[str] = None,
                 update_callback: Optional[Callable] = None
                 ) -> None:
        self.addon = addon
        self.asset_data = asset_data
        self.size = size
        self.lod = lod
        self.download_lods = download_lods
        self.native_mesh = native_mesh
        self.renderer = renderer
        self.update_callback = update_callback
        self.dir_target = os.path.join(dir_target, asset_data.asset_name)
        self.download_list = []

        self.convention = self.asset_data.get_convention()
        self.type_data = self.asset_data.get_type_data()
        self.workflow = self.type_data.get_workflow()

        self.identified_previews = 0
        self.previews_reported = False

        self.timer = DownloadTimer()

    def kickoff_download(self) -> bool:
        self.set_data_payload()
        self.create_download_folder()

        self.tpe = ThreadPoolExecutor(max_workers=MAX_PARALLEL_DOWNLOADS_PER_ASSET)

        self.run_asset_download_retries(self.download_asset)
        return self.all_done and not self.is_cancelled

    def download_asset(self) -> None:
        self.get_download_list()

        if self.download_list in [None, []]:
            self.any_error = True
            if self.res_error is not None:
                err = self.res_error
            else:
                err = "Empty download list"

            msg = f"AssetId: {self.asset_data.asset_id} Error: {err}"
            self.asset_data.state.dl.set_error(error_msg=err)
            self.addon.logger_dl.error(msg)

            # Only report to Sentry if the error is not Max Download rate
            if err != ERR_LIMIT_DOWNLOAD_RATE:
                self.addon._api.report_message("download_asset_empty_download_list",
                                               msg,
                                               "error")
            return

        self.download_loop()

    def set_data_payload(self) -> None:
        self.data_payload = {
            "assets": [
                {
                    "id": self.asset_data.asset_id,
                    "name": self.asset_data.asset_name
                }
            ]
        }

        if self.convention == 0:
            self.data_payload["assets"][0]["sizes"] = [self.size]
        elif self.convention == 1:
            self.data_payload["assets"][0]["resolution"] = self.size

        if self.asset_data.asset_type in [AssetType.HDRI, AssetType.TEXTURE]:
            self.set_texture_payload()
        elif self.asset_data.asset_type == AssetType.MODEL:
            self.set_model_payload()

    def set_texture_payload(self) -> None:
        if self.convention == 0:
            map_codes = self.type_data.get_map_type_code_list(self.workflow)
            self.data_payload["assets"][0]["workflows"] = [self.workflow]
            self.data_payload["assets"][0]["type_codes"] = map_codes
        elif self.convention == 1:
            prefs_available = self.addon.user.map_preferences is not None
            use_prefs = self.addon.user.use_preferences_on_download
            if prefs_available and use_prefs:
                map_descs, _ = self.type_data.get_maps_per_preferences(
                    self.addon.user.map_preferences,
                    filter_extensions=True)
            else:
                map_descs = self.type_data.map_descs[self.workflow]

            map_list = []
            for _map_desc in map_descs:
                file_format = "UNKNOWN"
                for _ff in SUPPORTED_TEX_FORMATS:
                    if _ff in _map_desc.file_formats:
                        file_format = _ff
                        break
                if file_format == "UNKNOWN":
                    map_name = _map_desc.display_name
                    msg = (f"UNKNOWN file format for download; "
                           f"Asset Id: {self.asset_data.asset_id} Map: {map_name}")
                    self.addon._api.report_message(
                        "download_invalid_format", msg, "error")
                    self.addon.logger_dl.error(msg)

                map_dict = {
                    "type": _map_desc.map_type_code,
                    "format": file_format
                }
                map_list.append(map_dict)
            self.data_payload["assets"][0]["maps"] = map_list

    def set_model_payload(self) -> None:
        self.data_payload["assets"][0]["lods"] = int(self.download_lods)

        if self.native_mesh and self.renderer is not None:
            self.data_payload["assets"][0]["softwares"] = [self.addon._api.software_dl_dcc]
            self.data_payload["assets"][0]["renders"] = [self.renderer]
        else:
            self.data_payload["assets"][0]["softwares"] = ["ALL_OTHERS"]

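For reference, a hypothetical illustration of the request payload the methods above assemble for a convention-1 texture download. The asset id, name, and map entries are made up; only the key names (`assets`, `resolution`, `maps`, `type`, `format`) are the ones used in this module:

```python
# Hypothetical example (made-up asset values) matching the keys set by
# set_data_payload() / set_texture_payload() for a convention-1 texture.
example_payload = {
    "assets": [
        {
            "id": 12345,              # asset_id (made up)
            "name": "MetalPlate001",  # asset_name (made up)
            "resolution": "4K",       # convention 1 stores the size here
            "maps": [
                {"type": "COL", "format": "jpg"},
                {"type": "NRM", "format": "png"},
            ],
        }
    ]
}
```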
    def create_download_folder(self) -> bool:
        try:
            os.makedirs(self.dir_target, exist_ok=True)
        except PermissionError:
            self.asset_data.state.dl.set_error(error_msg=ERR_OS_NO_PERMISSION)
            self.addon.logger_dl.exception(
                f"{ERR_OS_NO_PERMISSION}: {self.dir_target}")
            return False
        except OSError as e:
            self.asset_data.state.dl.set_error(error_msg=str(e))
            self.addon.logger_dl.exception(f"Download directory: {self.dir_target}")
            return False

        self.addon.logger_dl.debug(f"Download directory: {self.dir_target}")
        self.asset_data.state.dl.set_directory(self.dir_target)

        # For convention 1 it should be saved in a size folder
        if self.size is not None and self.convention == 1:
            self.dir_size_target = os.path.join(self.dir_target, self.size)
            if not os.path.isdir(self.dir_size_target):
                os.makedirs(self.dir_size_target, exist_ok=True)

        return True

    def get_download_list(self) -> None:
        self.timer.start()
        # Getting dl list (FileDownload) and total bytes size
        res = self.addon._api.download_asset_get_urls(self.data_payload)
        if not res.ok:
            self.res_error = res.error
            custom_msg = res.body.get("errors", {}).get("message", [])
            if len(custom_msg) > 0:
                self.res_error_message = custom_msg[0]
            return

        self.build_download_list(res)

        self.addon.logger_dl.info(
            f"=== Requesting URLs took {self.timer.end():.3f} s.")

    def define_download_folder(self, filename: str) -> str:
        if self.convention == 0:
            return self.dir_target

        dl_folder = self.dir_size_target
        base_filename, suffix = os.path.splitext(filename)
        base_filename_low = base_filename.lower()

        last_preview = None
        for _preview in PREVIEWS:
            if base_filename_low.endswith(_preview):
                dl_folder = self.dir_target
                self.identified_previews += 1
                last_preview = filename

        if self.identified_previews > 3 and not self.previews_reported:
            msg = (f"Identifying multiple Preview images for "
                   f"Asset id: {self.asset_data.asset_id} (e.g. {last_preview})")
            self.addon._api.report_message(
                "multiple_previews", msg, level="info")
            self.previews_reported = True

        return dl_folder

    def build_download_list(self, res: ApiResponse) -> None:
        files_list = res.body.get("files", [])
        self.uuid = res.body.get("uuid", None)
        if self.uuid in [None, ""]:
            self.addon.logger_dl.error("No UUID for download")

        model_exists = False
        filename_model_fbx_source = None
        url_model_fbx_source = None
        size_expected_model_fbx_source = 0
        for url_dict in files_list:
            url = url_dict.get("url")
            filename = url_dict.get("name")
            size_expected = url_dict.get("bytes", 0)
            resolution_size = url_dict.get("resolution", None)

            self.size_asset_bytes_expected += size_expected
            if not url or not filename:
                raise RuntimeError(f"Missing url or filename {url}")
            elif "_SOURCE" in filename:
                if filename.lower().endswith(".fbx"):
                    filename_model_fbx_source = filename
                    url_model_fbx_source = url
                    size_expected_model_fbx_source = size_expected
                continue

            filename_ext = os.path.splitext(filename)[1].lower()
            filename_ext = filename_ext[1:]  # get rid of dot
            if filename_ext.lower() in MODEL_FILE_EXT:
                model_exists = True

            dl = FileDownload(
                asset_id=self.asset_data.asset_id,
                url=url,
                filename=filename,
                convention=self.convention,
                size_expected=size_expected,
                resolution_size=resolution_size,
                directory=self.define_download_folder(filename))
            self.download_list.append(dl)

        # Fallback if "xyz_SOURCE.fbx" is the only model file
        if filename_model_fbx_source is not None and not model_exists:
            dl = FileDownload(asset_id=self.asset_data.asset_id,
                              url=url_model_fbx_source,
                              filename=filename_model_fbx_source,
                              convention=self.convention,
                              size_expected=size_expected_model_fbx_source,
                              directory=self.dir_target)
            self.download_list.append(dl)
            msg = f"Model asset with just SOURCE LOD: {self.asset_data.asset_id}"
            self.addon._api.report_message(
                "model_with_only_source_lod", msg, level="info")

        self.set_dynamic_files(res.body.get("dynamic_files", None))

    def set_dynamic_files(self,
                          dynamic_files_api: Optional[List[Dict]]
                          ) -> None:
        """Reads dynamic file information from server's API response."""

        if dynamic_files_api is None:
            return

        self.dynamic_files_list = []
        for _dynamic_file_dict in dynamic_files_api:
            name = _dynamic_file_dict.get("name", None)
            contents = _dynamic_file_dict.get("contents", None)
            dynamic_file = DynamicFile(name=name, contents=contents)
            self.dynamic_files_list.append(dynamic_file)

    def download_loop(self) -> None:
        """The actual download loop in download_asset_sync()."""

        self.all_done = False
        self.addon.logger_dl.debug("Download Loop")

        self.asset_data.state.dl.set_progress(0.001)
        if self.asset_data.state.dl.is_cancelled():
            self.is_cancelled = True
            self.cancel_downloads()
            return

        self.schedule_downloads()
        self.download_asset_loop_poll()

        if self.all_done:
            # Consider download failed upon dynamic file error.
            #
            # ATM we will not expose any issues with dynamic file data from server
            # and let the entire download succeed, anyway.
            self.all_done = self.store_dynamic_files(expose_api_error=False)
            self.rename_downloads()

        return

    def track_quality(self) -> None:
        if self.uuid in [None, ""]:
            return

        if self.all_done:
            # User may still have cancelled download (judging by state in
            # asset data), but we succeeded anyway
            self.addon._api.track_download_quality(uuid=self.uuid,
                                                   status=DQStatus.SUCCESS)
        elif self.is_cancelled and not self.any_error:
            self.addon._api.track_download_quality(uuid=self.uuid,
                                                   status=DQStatus.CANCELED,
                                                   error="User cancelled download")
        else:
            file_dl_error = self.dl_error
            if file_dl_error is None:
                return
            msg = (f"Error: {file_dl_error.error}, "
                   f"File: {file_dl_error.url}, CF-ray: {file_dl_error.cf_ray}")
            self.addon._api.track_download_quality(uuid=self.uuid,
                                                   status=DQStatus.FAILED,
                                                   error=msg)

    def schedule_downloads(self,
                           download_list: Optional[List[FileDownload]] = None
                           ) -> None:
        """Submits downloads to thread pool."""

        if download_list is None:
            download_list = self.download_list
        self.addon.logger_dl.debug("Scheduling Downloads")

        download_list.sort(key=lambda dl: dl.size_expected)
        for download in download_list:
            # Andreas: Could also check here, if already DONE and not start
            # the thread at all.
            # Yet, I decided to prefer it handled by the thread itself.
            # In this way the flow is always identical.
            download.status = DownloadStatus.WAITING
            download.retries += 1
            download.fut = self.tpe.submit(self.addon._api.download_asset_file,
                                           download=download)
            self.addon.logger_dl.debug(f"Submitted {download.filename}. "
                                       f"Retry: {download.retries}")
        self.addon.logger_dl.debug("Download Asset Schedule Done")

    def check_download_progress(self) -> None:
        self.addon.logger_dl.debug(self.download_list)
        self.any_error = False
        self.error_dl_list = []
        self.is_cancelled = self.asset_data.state.dl.is_cancelled()

        self.all_done = True
        self.size_asset_bytes_downloaded = 0
        for download in self.download_list:
            self.size_asset_bytes_downloaded += download.size_downloaded

            fut = download.fut
            if not fut.done():
                self.all_done = False
                continue

            res = fut.result()
            exc = fut.exception()
            res_error = res.error
            had_excp = exc is not None
            if not res.ok or had_excp:
                if had_excp:
                    self.addon.logger_dl.error(exc)
                self.any_error = True
                self.all_done = False
                download.error = res_error
                self.asset_data.state.dl.set_error(error_msg=res_error)
                self.error_dl_list.append(download)

        if self.any_error:
            self.process_file_retries()
        elif self.all_done:
            self.addon.logger_dl.debug("All Done :)")

        self.set_progress()

    def download_asset_loop_poll(self) -> None:
        """Used in download_asset_sync to poll results inside download loop."""

        self.addon.logger_dl.debug("Starting Download Poll Loop")
        while not self.all_done and not self.stop_files_retry and not self.is_cancelled:
            time.sleep(DOWNLOAD_POLL_INTERVAL)
            self.check_download_progress()

    def process_file_retries(self) -> None:
        """Manages the retries per file."""

        for dl_error in self.error_dl_list:
            if not dl_error.do_retry():
                self.dl_error = dl_error
                self.stop_files_retry = True
                break

        if not self.is_cancelled and not self.stop_files_retry:
            self.schedule_downloads(self.error_dl_list)
            return

        self.cancel_downloads()

    def cancel_downloads(self) -> None:
        """Cancels all download threads"""

        self.addon.logger_dl.debug("Start cancel")

        for download in self.download_list:
            download.set_status_cancelled()
            if download.fut is not None:
                download.fut.cancel()

        # Wait for threads to actually return
        self.addon.logger_dl.debug("Waiting")
        for download in self.download_list:
            if download.fut is None:
                continue
            if download.fut.cancelled():
                continue
            try:
                download.fut.result(timeout=60)
            except TimeoutError:
                # TODO(Andreas): Now there seems to be some real issue...
                msg = (f"Asset id {self.asset_data.asset_id} download file "
                       "future Timeout error with no result.")
                self.addon._api.report_message("download_file_with_no_result",
                                               msg,
                                               "error")
                raise
            except BaseException:
                msg = (f"Asset id {self.asset_data.asset_id} download file "
                       "exception with no result.")
                self.addon._api.report_message("download_file_with_no_result",
                                               msg,
                                               "error")
                self.addon.logger_dl.exception(f"Unexpected error: {msg}")
                raise

        self.addon.logger_dl.debug("Done")

    def set_progress(self) -> None:
        progress = self.size_asset_bytes_downloaded / max(self.size_asset_bytes_expected, 1)
        self.asset_data.state.dl.set_progress(max(progress, 0.001))
        self.asset_data.state.dl.set_downloaded_bytes(
            self.size_asset_bytes_downloaded)
        try:  # Init progress bar
            self.update_callback()
        except TypeError:
            pass  # No update callback

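The progress computation in `set_progress()` above boils down to a divide-by-zero-safe ratio clamped to a small non-zero floor, so the UI bar reads as "started" even before the first byte lands. A standalone sketch (the function name is ours, not part of the addon):

```python
def progress_fraction(downloaded: int, expected: int) -> float:
    # Guard against zero expected bytes, then clamp to a visible floor.
    fraction = downloaded / max(expected, 1)
    return max(fraction, 0.001)

# progress_fraction(0, 0) -> 0.001; progress_fraction(512, 1024) -> 0.5
```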
    def do_retry(self) -> bool:
        first_attempt = self.retries == 0
        if first_attempt:
            return True
        do_retry = self.retries < self.max_retries
        expired_error = False
        if self.dl_error is not None:
            expired_error = self.dl_error.error == ERR_URL_EXPIRED

        # Asset level download only retries in case of Expired URL
        return expired_error and do_retry

    def run_asset_download_retries(self, method: Callable) -> None:
        while self.do_retry():
            self.retries += 1
            method()

        self.track_quality()
        self.cancel_downloads()

    def rename_downloads(self) -> Tuple[bool, str]:
        """Renames downloaded temp files."""
        self.addon.logger_dl.debug("Start rename")

        error_msg = ""
        all_successful = True
        for download in self.download_list:
            if download.status != DownloadStatus.DONE:
                self.addon.logger_dl.warning(("File status not done despite "
                                              "all files reported done!"))
            path_temp = download.get_path(temp=True)
            temp_exists = os.path.exists(path_temp)
            path_final = download.get_path(temp=False)
            final_exists = os.path.exists(path_final)
            if not temp_exists and final_exists:
                continue

            try:
                os.rename(path_temp, path_final)
            except FileExistsError:
                os.remove(path_temp)
            except FileNotFoundError:
                download.status = DownloadStatus.ERROR
                download.error = f"Missing file: {path_temp}"
                self.addon.logger_dl.error(
                    ("Neither temp download file nor target do exist\n"
                     f" {path_temp}\n"
                     f" {path_final}"))
                all_successful = False
            except PermissionError:
                # Note from Andreas:
                # I am not entirely sure, how this can happen (after all we
                # just downloaded the file...).
                # My assumption is, that somehow the download thread (while
                # already being done) did not actually exit, yet, maybe due to
                # some scheduling mishaps and is still keeping a handle to the
                # file. If I am correct, maybe a "sleep(0.1 sec)" and another
                # attempt to rename could get us out of this.
                # But that's of course pretty ugly and we should discuss
                # first, if we want to try something like this or just let
                # the download fail.
                download.status = DownloadStatus.ERROR
                download.error = ("Lacking permission to rename downloaded"
                                  f" file: {path_temp}")
                self.addon.logger_dl.error(
                    (f"No permission to rename download:\n from: {path_temp}"
                     f"\n to: {path_final}"))
                all_successful = False

            # Keep the first error found to give feedback to the user
            if not error_msg and download.error not in [None, ""]:
                error_msg = download.error

        self.addon.logger_dl.debug(f"Done, success = {all_successful}")
        return all_successful, error_msg

    def _check_xml_data(self,
                        xml_s: str,
                        expose_api_error: bool = False
                        ) -> bool:
        """Checks an XML string for correct XML structure."""

        asset_data = self.asset_data
        asset_id = asset_data.asset_id

        xml_ok = False
        try:
            # While we are not really interested in actual contents atm,
            # we parse the XML nevertheless to make sure it is "parseable".
            xml_root = ElementTree.XML(xml_s)
            if xml_root is not None:
                xml_ok = True
        except ElementTree.ParseError as e:
            if expose_api_error:
                asset_data.state.dl.set_error(error_msg="Dynamic file error")
            msg = (f"Could not save dynamic file for {asset_id}, "
                   f"XML parsing issue\n{e}")
            self.addon.logger_dl.exception(msg)
            self.addon._api.report_message("download_df_xml_issue", msg, "error")
        if not xml_ok:
            return False  # NOK reported above in exception

        return True

    def _check_dynamic_file_data(self,
                                 dynamic_file: DynamicFile,
                                 expose_api_error: bool = False
                                 ) -> bool:
        """Checks the dynamic file data (currently expecting XML) received
        from API.
        """

        asset_data = self.asset_data
        asset_id = asset_data.asset_id

        if dynamic_file.name is None:
            if expose_api_error:
                asset_data.state.dl.set_error(error_msg="Dynamic file error")
            msg = (f"Could not save dynamic file for {asset_id}, "
                   "no name provided")
            self.addon.logger_dl.error(msg)
            self.addon._api.report_message("download_df_no_filename", msg, "error")
            return False
        contents = dynamic_file.contents
        if contents is None:
            if expose_api_error:
                asset_data.state.dl.set_error(error_msg="Dynamic file error")
            msg = (f"Could not save dynamic file for {asset_id}, "
                   "no contents provided")
            self.addon.logger_dl.error(msg)
            self.addon._api.report_message("download_df_no_contents", msg, "error")
            return False

        return self._check_xml_data(contents, expose_api_error)

    def _store_single_dynamic_file(self,
                                   dynamic_file: DynamicFile,
                                   expose_api_error: bool = False
                                   ) -> bool:
        """Stores a dynamic file (currently only XML data) to disk."""

        asset_data = self.asset_data
        asset_id = asset_data.asset_id

        result = self._check_dynamic_file_data(dynamic_file, expose_api_error)
        if not result:
            # Here download fails only, if exposure of errors in dynamic_file
            # data from server is desired.
            return not expose_api_error

        # Since we need to store into the correct "size" subfolder, and
        # since MaterialX makes little sense if no other files were
        # downloaded, we'll use the path of the first FileDownload.
        if len(self.download_list) > 0:
            file_download = self.download_list[0]
            path_file = file_download.get_path()
            dir_asset = os.path.dirname(path_file)
        else:
            # Without any file downloads, all we have here is the asset's path:
            dir_asset = self.asset_data.state.dl.get_directory()
        path_dynamic_file = os.path.join(dir_asset, dynamic_file.name)

        try:
            with open(path_dynamic_file, "w") as write_file:
                write_file.write(dynamic_file.contents)
        except OSError as e:
            if e.errno == errno.ENOSPC:
                asset_data.state.dl.set_error(error_msg=ERR_OS_NO_SPACE)
                msg = f"Asset {asset_id}: No space for dynamic file."
                # TODO(Andreas): No logger in PoliigonConnector, yet
                self.addon.logger_dl.exception(msg)
                self.addon._api.report_message("download_df_no_space", msg, "error")
            elif e.errno == errno.EACCES:
                asset_data.state.dl.set_error(
                    error_msg=ERR_OS_NO_PERMISSION)
                msg = f"Asset {asset_id}: No permission to write dynamic file."
                # TODO(Andreas): No logger in PoliigonConnector, yet
                self.addon.logger_dl.exception(msg)
                self.addon._api.report_message("download_df_permission", msg, "error")
            else:
                asset_data.state.dl.set_error(error_msg=str(e))
                msg = (f"Asset {asset_id}: Unexpected error "
                       "upon writing dynamic file.")
                # TODO(Andreas): No logger in PoliigonConnector, yet
                self.addon.logger_dl.exception(msg)
                msg += f"\n{e}"
                self.addon._api.report_message("download_df_os_error", msg, "error")
            # Note: Even if dynamic file data issue above does not get exposed
            #       to user, any failure to write the correct MaterialX data
            #       will still lead to a failed download.
            return False
        return True

    def store_dynamic_files(self,
                            expose_api_error: bool = False
                            ) -> bool:
        """Stores all dynamic files belonging to an asset download to disk."""

        if self.dynamic_files_list is None:
            return True
        if len(self.dynamic_files_list) == 0:
            return True

        # Note: We'll get here only after all asset files got downloaded
        #       successfully. Thus we can store any dynamic files errors in
        #       AssetData's download status (no need to be afraid of
        #       overwriting any other error) to present in UI.
        for _dynamic_file in self.dynamic_files_list:
            result = self._store_single_dynamic_file(_dynamic_file,
                                                     expose_api_error)
            if not result:
                return False
        return True
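The asset-level retry policy implemented by `do_retry()` and `run_asset_download_retries()` above (first attempt always runs; further attempts only while the last error was an expired URL and the cap is not reached) can be sketched standalone. The constants and class names here are ours, chosen to mirror the logic, not the addon's real API:

```python
ERR_URL_EXPIRED = "URL expired"   # assumed error marker
MAX_RETRIES = 3                   # assumed cap


class RetryPolicy:
    def __init__(self):
        self.retries = 0
        self.last_error = None

    def do_retry(self) -> bool:
        if self.retries == 0:
            return True  # first attempt always runs
        under_cap = self.retries < MAX_RETRIES
        expired = self.last_error == ERR_URL_EXPIRED
        # Only expired-URL errors warrant another asset-level attempt.
        return expired and under_cap

    def run(self, method) -> None:
        while self.do_retry():
            self.retries += 1
            method()


attempts = []


def flaky_download():
    attempts.append(len(attempts))
    # First two attempts hit an expired URL; afterwards no retryable error.
    policy.last_error = ERR_URL_EXPIRED if len(attempts) < 3 else None


policy = RetryPolicy()
policy.run(flaky_download)
# Three calls total: the initial try plus two expired-URL retries.
```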

# ---------------------------------------------------------------------------
# env.py
# ---------------------------------------------------------------------------

# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

import os
from typing import Optional

try:
    import ConfigParser
except Exception:
    import configparser as ConfigParser


class PoliigonEnvironment:
    """Poliigon environment used for assisting in program control flow."""

    addon_name: str  # e.g. poliigon-addon-blender
    base: str  # Path to base directory of addon or package
    env_filename: str

    config: ConfigParser.ConfigParser = None

    # Required env fields
    api_url: str = ""
    api_url_v2: str = ""
    env_name: str = ""

    required_attrs = ["api_url", "api_url_v2", "env_name"]

    # Optional env fields
    host: str = ""
    forced_sampling: bool = False
    local_updater_json: Optional[str] = None

    def __init__(self,
                 addon_name: str,
                 base: str = os.path.dirname(os.path.abspath(__file__)),
                 env_filename: str = "env.ini"):
        self.addon_name = addon_name
        self.base = base
        self.env_filename = env_filename
        self._update_files(base)
        self._load_env(base, env_filename)

    def _load_env(self, path, filename):
        env_file = os.path.join(path, filename)
        if os.path.exists(env_file):
            try:
                # Read .ini here and set values
                # https://docs.python.org/3/library/configparser.html#configparser.ConfigParser.optionxform
                config = ConfigParser.ConfigParser()
                config.optionxform = str
                config.read(env_file)

                # Required fields
                self.config = config
                self.api_url = config.get("DEFAULT", "api_url", fallback="")
                self.api_url_v2 = config.get("DEFAULT", "api_url_v2", fallback="")
                self.env_name = config.get("DEFAULT", "env_name", fallback="")

                for k, v in vars(self).items():
                    if k in self.required_attrs and v in [None, ""]:
                        raise ValueError(
                            f"Attribute '{k}' missing from env file")

                # Optional fields that should always be present
                self.host = config.get("DEFAULT", "host", fallback="")
                self.forced_sampling = config.getboolean(
                    "DEFAULT", "forced_sampling", fallback=False)
                self.local_updater_json = config.get(
                    "DEFAULT", "local_updater_json", fallback=None)

            except ValueError as e:
                msg = f"Could not load environment file for {self.addon_name}"
                raise RuntimeError(msg) from e
        else:
            # Assume production environment and set fallback values
            self.api_url = "https://api.poliigon.com/api/v1"
            self.api_url_v2 = "https://apiv2.poliigon.com/api/v2"
            self.env_name = "prod"
            self.host = ""
            self.forced_sampling = False

    def _update_files(self, path):
        """Updates files in the specified path within the addon."""
        update_key = "_update"
        search_key = "env" + update_key
        files_to_update = [f for f in os.listdir(path)
                           if os.path.isfile(os.path.join(path, f))
                           and os.path.splitext(f)[0].endswith(search_key)]

        for f in files_to_update:
            f_split = os.path.splitext(f)
            tgt_file = f_split[0][:-len(update_key)] + f_split[1]

            try:
                os.replace(os.path.join(path, f), os.path.join(path, tgt_file))
                print(f"Updated {tgt_file}")
            except PermissionError as e:
                print(f"Encountered 'file_permission_error': {e}")
            except OSError as e:
                print(f"Encountered 'os_error': {e}")
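The `_update_files` method above implements a simple staging scheme: a file shipped as `env_update.ini` atomically replaces the live `env.ini` on the next startup via `os.replace`. A standalone sketch of the rename step, generalized to any `*_update` stem and run against a temporary directory rather than the real addon path (`apply_updates` is our name, not the addon's):

```python
import os
import tempfile


def apply_updates(path: str, update_key: str = "_update") -> list:
    """Rename every '<stem>_update.<ext>' in path to '<stem>.<ext>'."""
    replaced = []
    for f in os.listdir(path):
        stem, ext = os.path.splitext(f)
        if not stem.endswith(update_key):
            continue
        target = stem[:-len(update_key)] + ext
        # os.replace overwrites an existing target atomically where possible.
        os.replace(os.path.join(path, f), os.path.join(path, target))
        replaced.append(target)
    return replaced


with tempfile.TemporaryDirectory() as tmp:
    with open(os.path.join(tmp, "env_update.ini"), "w") as fh:
        fh.write("[DEFAULT]\nenv_name = prod\n")
    result = apply_updates(tmp)
    remaining = os.listdir(tmp)
```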

# ---------------------------------------------------------------------------
# logger module
# ---------------------------------------------------------------------------

import logging
import os
from io import StringIO
from typing import Optional, List

from .env import PoliigonEnvironment


# Numerical comment -> values for .ini
NOT_SET = logging.NOTSET  # 0
DEBUG = logging.DEBUG  # 10
INFO = logging.INFO  # 20
WARNING = logging.WARNING  # 30
ERROR = logging.ERROR  # 40
CRITICAL = logging.CRITICAL  # 50


# Import and use get_addon_logger() to get hold of the logger manager
addon_logger = None


class MockLogger:
    """Placeholder logger which accepts any arguments and does nothing"""

    def __getattr__(self, method_name):
        # Names matching the built in logging class methods
        log_methods = ['critical', 'error', 'exception', 'fatal',
                       'debug', 'info', 'log',
                       'warn', 'warning']
        if method_name not in log_methods:
            raise RuntimeError("Invalid logger method")

        def method(*args, **kwargs):
            return None
        return method
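The `MockLogger` pattern above works because `__getattr__` is only invoked when normal attribute lookup fails, so every known logging method name is intercepted and answered with a do-nothing callable, while unknown names raise. A minimal re-creation (class name is ours):

```python
class NullLogger:
    """Accepts any standard logging call and silently discards it."""

    _LOG_METHODS = {"critical", "error", "exception", "fatal",
                    "debug", "info", "log", "warn", "warning"}

    def __getattr__(self, method_name):
        if method_name not in self._LOG_METHODS:
            raise RuntimeError("Invalid logger method")

        def method(*args, **kwargs):
            return None  # swallow the call
        return method


logger = NullLogger()
logger.info("not yet configured", extra={"k": 1})  # silently ignored
```

This lets calling code log unconditionally before a real logger has been configured.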
|
||||
|
||||
class AddonLogger:
|
||||
"""Class to store all the data (created loggers, formatting information
|
||||
and handlers) and functionality related to the Addon Logs.
|
||||
|
||||
loggers: Stores all created loggers;
|
||||
dcc_handlers: Stores all handlers created on DCC side.
|
||||
For adding new Handlers, use the method set_dcc_handlers;
|
||||
write_to_file: Defines if the new loggers write the output into a .log file;
|
||||
log_file_path: The output path to create the .log file;
|
||||
|
||||
"""
|
||||
|
||||
loggers = []
|
||||
dcc_handlers = []
|
||||
|
||||
write_to_file: bool = False
|
||||
|
||||
str_format = ("%(name)s, %(levelname)s, %(threadName)s, %(asctime)s, "
|
||||
"%(filename)s/%(funcName)s:%(lineno)d: %(message)s")
|
||||
|
||||
date_format = "%I:%M:%S"
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
env: Optional[PoliigonEnvironment] = None,
|
||||
file_path: Optional[str] = None):
|
||||
|
||||
self.addon_env = env
|
||||
self.file_handler = None
|
||||
self.stream_handler = None
|
||||
|
||||
addon_core_path = os.path.split(os.path.split(os.path.realpath(__file__))[0])[0]
|
||||
self.log_file_path = os.path.join(addon_core_path, "logs.log")
|
||||
|
||||
if file_path is not None:
|
||||
self.log_file_path = file_path
|
||||
self.write_to_file = True
|
||||
else:
|
||||
try:
|
||||
self.write_to_file = self.addon_env.config.getboolean(
|
||||
"DEFAULT", "log_to_file", fallback=False)
|
||||
except AttributeError:
|
||||
pass # no .ini

    def _init_filehandler(self, have_filehandler: bool) -> None:
        """Optionally initializes the default file handler."""

        if not have_filehandler:
            self.file_handler = None
            return

        log_file_exists = os.path.isfile(self.log_file_path)
        if log_file_exists:
            log_file_write_access = os.access(self.log_file_path, os.W_OK)
        else:
            log_file_write_access = True

        log_dir = os.path.dirname(self.log_file_path)
        log_dir_exists = os.path.isdir(log_dir)
        if log_dir_exists:
            log_dir_write_access = os.access(log_dir, os.W_OK)
        else:
            log_dir_write_access = False

        log_file_overwrite = log_file_exists and log_file_write_access
        log_file_create_allowed = (log_dir_exists and log_dir_write_access
                                   and not log_file_exists)
        if log_file_overwrite or log_file_create_allowed:
            if self.file_handler is None:
                self.file_handler = AddonFileHandler(self.log_file_path, self)

    def initialize_logger(self, module_name: Optional[str] = None,
                          *,
                          log_lvl: Optional[int] = None,
                          log_stream: Optional[StringIO] = None,
                          base_name: str = "Addon",
                          append_dcc_handlers: bool = True,
                          have_filehandler: bool = True
                          ) -> logging.Logger:
        """Sets format and log level, and returns a logger instance.

        Args:
            module_name: The name of the module.
                The env log_lvl variable name is derived as follows:
                Logger name: Addon => log_lvl
                Logger name: Addon.DL => log_lvl_dl
                Logger name: Addon.P4C.UI => log_lvl_p4c_ui
                But also:
                Logger name: bonnie => log_lvl
                Logger name: clyde.whatever => log_lvl_whatever
            log_lvl: Integer specifying which logs to be printed, one of:
                https://docs.python.org/3/library/logging.html#levels
            log_stream: Output to a StringIO stream instead of the console
                if not None.
            base_name: By default all loggers get derived from logger "Addon".
            append_dcc_handlers: Defines if the handlers cached in
                self.dcc_handlers should be added for the new logger.
            have_filehandler: Set to False to disable logging to a file.

        Returns:
            A reference to the initialized logger instance.

        Raises:
            AttributeError: If log_lvl and env are both None.
        """

        if module_name is None:
            logger_name = f"{base_name}"
            name_hierarchy = []
        else:
            logger_name = f"{base_name}.{module_name}"
            name_hierarchy = module_name.split(".")

        if log_lvl is None:
            log_lvl_name = "log_lvl"
            for name in name_hierarchy:
                log_lvl_name += f"_{name.lower()}"

            try:
                log_lvl = self.addon_env.config.getint(
                    "DEFAULT", log_lvl_name, fallback=NOT_SET)
            except AttributeError:
                log_lvl = NOT_SET  # no .ini

        logger = logging.getLogger(logger_name)
        logger.propagate = False
        logger.setLevel(log_lvl)

        if self.stream_handler is None:
            self.stream_handler = logging.StreamHandler(log_stream)
            self.set_dcc_handlers(handlers=[self.stream_handler])

        self._init_filehandler(have_filehandler)
        if self.file_handler is not None:
            self.set_dcc_handlers(handlers=[self.file_handler])

        if append_dcc_handlers and len(self.dcc_handlers) > 1:
            self.set_dcc_handlers(loggers=[logger])

        self.loggers.append(logger)

        return logger
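The env-variable naming scheme described in the docstring above can be sketched in isolation. `derive_log_lvl_name` is a hypothetical helper (not part of the module) that mirrors the loop `initialize_logger` runs over `name_hierarchy`:

```python
def derive_log_lvl_name(module_name):
    # Mirror the scheme used by initialize_logger: start from "log_lvl"
    # and append each lowercased, dot-separated part of the module name.
    name = "log_lvl"
    if module_name is not None:
        for part in module_name.split("."):
            name += f"_{part.lower()}"
    return name


print(derive_log_lvl_name(None))      # log_lvl
print(derive_log_lvl_name("DL"))      # log_lvl_dl
print(derive_log_lvl_name("P4C.UI"))  # log_lvl_p4c_ui
```

This matches the docstring table: logger "Addon" reads `log_lvl`, "Addon.DL" reads `log_lvl_dl`, "Addon.P4C.UI" reads `log_lvl_p4c_ui`.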

    def set_write_to_file_handler(
            self, enabled: bool, path: Optional[str] = None) -> None:
        self.write_to_file = enabled
        if path is not None:
            self.log_file_path = path

    def set_dcc_handlers(
            self,
            loggers: Optional[List[logging.Logger]] = None,
            handlers: Optional[List[logging.Handler]] = None) -> None:
        loggers_to_add = loggers if loggers is not None else self.loggers
        handlers_to_add = handlers if handlers is not None else self.dcc_handlers
        for _logger in loggers_to_add:
            for _handler in handlers_to_add:
                self.set_handlers_formatter([_handler])
                _logger.addHandler(_handler)
                if _handler not in self.dcc_handlers:
                    self.dcc_handlers.append(_handler)

    def set_handlers_formatter(
            self,
            handlers: Optional[List[logging.Handler]] = None) -> None:
        handlers_to_format = handlers if handlers is not None else self.dcc_handlers
        formatter = logging.Formatter(
            fmt=self.str_format, datefmt=self.date_format)
        for _handler in handlers_to_format:
            _handler.setFormatter(formatter)


class AddonFileHandler(logging.FileHandler):
    def __init__(self, filepath: str, log_manager: AddonLogger):
        super(AddonFileHandler, self).__init__(filepath)

        self.log_manager = log_manager
        self.original_emit_event = self.emit
        self.emit = self.custom_emit

    def change_log_filename(self, filename: str):
        if os.path.isfile(filename):
            self.baseFilename = filename

    def custom_emit(self, record: logging.LogRecord) -> None:
        if not self.log_manager.write_to_file:
            return

        if self.log_manager.log_file_path != self.baseFilename:
            self.baseFilename = self.log_manager.log_file_path
            self.close()
            self.stream = None

        # Ensure the original emit event runs
        self.original_emit_event(record)


def get_addon_logger(env: Optional[PoliigonEnvironment] = None) -> AddonLogger:
    global addon_logger
    if addon_logger is None:
        addon_logger = AddonLogger(env)
    return addon_logger
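AddonFileHandler swaps the handler's bound `emit` for a gated wrapper so records can be dropped without removing the handler. A minimal, self-contained sketch of that pattern, using a `StreamHandler` and a plain boolean gate instead of the log manager's `write_to_file` flag (names here are illustrative only):

```python
import io
import logging


class GatedHandler(logging.StreamHandler):
    """Sketch of the emit-swap pattern: keep the original emit and only
    forward records to it while the gate is open."""

    def __init__(self, stream=None):
        super().__init__(stream)
        self.enabled = False  # stand-in for log_manager.write_to_file
        self.original_emit = self.emit  # bound class method, saved first
        self.emit = self.custom_emit    # instance attribute shadows it

    def custom_emit(self, record):
        if not self.enabled:
            return  # drop silently while the gate is closed
        self.original_emit(record)


buf = io.StringIO()
handler = GatedHandler(buf)
logger = logging.getLogger("gate_demo")
logger.handlers = [handler]
logger.propagate = False
logger.setLevel(logging.INFO)

logger.info("dropped")  # gate closed: nothing is written
handler.enabled = True
logger.info("written")  # gate open: record reaches the stream
print("written" in buf.getvalue())  # True
```

Because `logging.Handler.handle` looks up `self.emit`, the instance attribute wins over the class method, which is exactly how `custom_emit` intercepts every record.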

@@ -0,0 +1,219 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

from enum import IntEnum
from dataclasses import dataclass
from typing import Dict, Optional

from .multilingual import _t


# MAPS_TYPE_NAMES defined at the end of the file (needs MapType defined)


@dataclass()
class MapDescription:
    description: str
    display_name: str


class MapType(IntEnum):
    """Supported texture map types.

    NOTE: When extending, existing values MUST NEVER be changed.
    NOTE 2: Derived from IntEnum for easier "to JSON" serialization.
    """

    # Convention 0 values
    DEFAULT = 1
    UNKNOWN = 1

    ALPHA = 2  # Usually associated with a brush
    ALPHAMASKED = 3
    AO = 4
    BUMP = 5
    BUMP16 = 6
    COL = 7
    DIFF = 8
    DISP = 9
    DISP16 = 10
    EMISSIVE = 11
    EMISSION = 11
    ENV = 12  # Environment for an HDRI, typically a .jpg file
    JPG = 12  # Environment for an HDRI, type_code as in ApiResponse
    FUZZ = 13
    GLOSS = 14
    IDMAP = 15
    LIGHT = 16  # Lighting for an HDRI, typically an .exr file
    HDR = 16  # Lighting for an HDRI, type_code as in ApiResponse
    MASK = 17  # Mask here means opacity
    METALNESS = 18
    NRM = 19
    NRM16 = 20
    OVERLAY = 21
    REFL = 22
    ROUGHNESS = 23
    SSS = 24
    TRANSLUCENCY = 25
    TRANSMISSION = 26
    OPACITY = 27
    UNDEF = 28
    # Non convention 0 types (needed for convention conversion)
    NA_ORM = 50
    NA_VERTEXBLEND = 51

    # Convention 1, values 100 to 149 (150 and up for convention 1 only maps)
    # NOTE: Value - 100 should match convention 0
    AmbientOcclusion = 104
    BaseColor = 107
    BaseColorOpacity = 103  # realtime only
    BaseColorVertexBlend = 151
    Displacement = 109
    Emission = 111
    Environment = 112  # Map for converting HDRI into Env
    HDRI = 116
    ORM = 150  # packmap where R:AO, G:Roughness, B:Metalness, realtime only
    Metallic = 118
    Normal = 119
    Opacity = 117
    Roughness = 123
    ScatteringColor = 124
    SheenColor = 113
    Translucency = 125
    Transmission = 126

    @classmethod
    def from_type_code(cls, map_type_code: str):
        if map_type_code in MAPS_TYPE_NAMES:
            return cls[map_type_code]

        map_type_code = map_type_code.split("_")[0]
        if map_type_code in MAPS_TYPE_NAMES:
            return cls[map_type_code]

        return cls.UNKNOWN

    def get_convention(self) -> int:
        return self.value // 100

    def get_effective(self):  # -> MapType
        return self.convert_convention(0)

    def convert_convention(self, target_convention: int):  # -> MapType
        convention_in = self.value // 100
        convention_diff = target_convention - convention_in
        return MapType(self.value + (100 * convention_diff))

    def get_description(self) -> Optional[MapDescription]:
        return MAP_DESCRIPTIONS.get(self.get_effective(), None)
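The convention arithmetic (`value // 100` is the convention; subtracting 100 maps a convention 1 value back to convention 0) can be verified with a tiny stand-in enum. `MiniMapType` is hypothetical and only re-uses a few of the values above:

```python
from enum import IntEnum


class MiniMapType(IntEnum):
    # Subset of the MapType values: convention = value // 100,
    # and value - 100 should match the convention 0 member.
    COL = 7
    AO = 4
    BaseColor = 107
    AmbientOcclusion = 104

    def get_convention(self) -> int:
        return self.value // 100

    def convert_convention(self, target_convention: int):
        convention_diff = target_convention - self.value // 100
        return MiniMapType(self.value + 100 * convention_diff)


print(MiniMapType.BaseColor.get_convention())            # 1
print(MiniMapType.BaseColor.convert_convention(0).name)  # COL
print(MiniMapType.COL.convert_convention(1).name)        # BaseColor
```

`get_effective()` above is simply `convert_convention(0)`, i.e. every convention 1 value is reduced to its convention 0 counterpart before the description lookup.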


MAP_DESCRIPTIONS: Dict = {
    MapType.ALPHAMASKED: MapDescription(
        description=_t(
            "This texture map is identical to the Base Color Map, but with "
            "an added Alpha channel containing the opacity map. This is "
            "included in materials containing empty see-through space such "
            "as sheer fabrics and leaves."),
        display_name=_t("Base Color Opacity")),

    MapType.AO: MapDescription(
        description=_t(
            "Defines the shadows in the crevices of the material. It's combined "
            "with the color map by using a Multiply layer blend operation."),
        display_name=_t("Ambient Occlusion")),

    MapType.COL: MapDescription(
        description=_t(
            "Contains the pure color information of the surface, "
            "devoid of any shadow or reflection."),
        display_name=_t("Base Color")),

    MapType.DISP: MapDescription(
        description=_t(
            "This black and white image defines the height information of the "
            "surface. Light values are raised, dark values are reduced, "
            "mid-grey (0.5) represents the flat mid-point of the surface."),
        display_name=_t("Displacement")),

    MapType.FUZZ: MapDescription(
        description=_t(
            "Defines the fine fuzz of microfibers in cloth-like surfaces. "
            "Included with many fabric textures. "
            "The sheen color defines only the color."),
        display_name=_t("Sheen Color")),

    MapType.METALNESS: MapDescription(
        description=_t(
            "This black and white image defines which parts are metal "
            "(white) and which are non-metal (black)."),
        display_name=_t("Metallic")),

    MapType.NRM: MapDescription(
        description=_t(
            "This purple-ish image defines the height information, which is "
            "faked by the shader (not physically altering the mesh)."),
        display_name=_t("Normal")),

    MapType.ROUGHNESS: MapDescription(
        description=_t(
            "This black and white image defines how sharp or diffuse the "
            "reflections are. Darker values are glossy, "
            "lighter values are matte."),
        display_name=_t("Roughness")),

    MapType.SSS: MapDescription(
        description=_t(
            "Defines the color of light passing through solid closed manifold "
            "objects like food or fabric. This is included in fabric "
            "and vegetation textures."),
        display_name=_t("Scattering Color")),

    MapType.TRANSLUCENCY: MapDescription(
        description=_t(
            "Defines the color of light penetrating and appearing on the "
            "backside of flat thin-shell meshes. "
            "This is included in fabric and vegetation textures."),
        display_name=_t("Translucency")),

    MapType.TRANSMISSION: MapDescription(
        description=_t(
            "Defines which parts of the texture are refracting light, "
            "and is included in textures like glass or liquids. "
            "The IOR (Index of Refraction) should be defined "
            "by you depending on the material."),
        display_name=_t("Transmission")),

    MapType.MASK: MapDescription(
        description=_t(
            "Defines which parts of the texture are opaque, or transparent "
            "(completely invisible, without refraction). "
            "This is included in materials containing empty see-through "
            "space such as sheer fabrics and leaves."),
        display_name=_t("Opacity")),

    MapType.NA_ORM: MapDescription(
        description=_t(
            "This special texture stores the same Ambient Occlusion, Roughness "
            "and Metalness information, but each is stored in the separate Red, "
            "Green and Blue channels respectively. This special map is "
            "typically only used in realtime rendering and game applications."),
        display_name=_t("ORM"))
}

MAPS_TYPE_NAMES = MapType.__members__
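`from_type_code` first tries the raw type code against `__members__`, then the prefix before the first underscore, falling back to `UNKNOWN`. A standalone sketch of that lookup, with a hypothetical two-member enum and an invented code like "COL_16" standing in for real API type codes:

```python
from enum import IntEnum


class Mini(IntEnum):
    UNKNOWN = 1
    COL = 7


NAMES = Mini.__members__  # same trick as MAPS_TYPE_NAMES


def from_type_code(code: str) -> Mini:
    # Mirrors MapType.from_type_code: exact name first, then the
    # prefix before the first underscore, then UNKNOWN.
    if code in NAMES:
        return Mini[code]
    code = code.split("_")[0]
    if code in NAMES:
        return Mini[code]
    return Mini.UNKNOWN


print(from_type_code("COL").name)     # COL
print(from_type_code("COL_16").name)  # COL
print(from_type_code("XYZ").name)     # UNKNOWN
```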

@@ -0,0 +1,82 @@
from typing import List, Callable
import gettext


def _m(message: str) -> str:
    """Placeholder to mark strings to be translated and stored in the .pot file"""
    return message


def _o(message: str) -> str:
    """Gets the original string, e.g. after setting a translated variable"""
    return gettext.dgettext("en-US", message)


def _t(message: str) -> str:
    """Gets the translated string with the current domain setup"""
    try:
        return gettext.gettext(message)  # noqa
    except NameError:
        # If no domain is initialized, _() will not be defined yet
        return message


class MsgFallback(gettext.NullTranslations):
    """Fallback to report if one is trying to translate a message that is not
    registered in the stored domains (platforms and languages).
    """

    def __init__(self, fallback_method: Callable = None) -> None:
        super().__init__()
        self.fallback = fallback_method

    def gettext(self, msg) -> str:
        self.fallback(msg)
        return msg


class Multilingual:
    """Class to store and manage all the domains for multilingual translation."""

    report_message: Callable  # Report function to be set by the addon
    curr_language: str
    supported_languages = ["en-US", "test_dummy"]

    # All domains already registered.
    # NOTE: Do not change this from outside this module.
    _domains: List[gettext.GNUTranslations]

    def __init__(self):
        self._domains = []
        self.report_message = None
        self.curr_language = None

    def install_domain(self,
                       language: str,
                       dir_lang: str,
                       domain: str = "addon-core") -> None:
        if language not in self.supported_languages:
            return

        current_domain = gettext.translation(domain,
                                             localedir=dir_lang,
                                             languages=[language, "en-US"])
        current_domain.install()

        # If there are already installed domains, they are used as fallback if
        # a given message is not found. Each new domain will fall back to the
        # previous one - until the first added domain, which will call
        # report_message_missing.
        if len(self._domains) > 0:
            current_domain.add_fallback(self._domains[-1])
        else:
            current_domain.add_fallback(MsgFallback(self.report_message_missing))

        self.curr_language = language
        self._domains.append(current_domain)

    def report_message_missing(self, msg: str) -> None:
        if self.report_message is not None and self.curr_language != "en-US":
            error_msg = f"{self.curr_language}:\"{msg}\""
            self.report_message("missing_locale_msg", error_msg, "error")
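The fallback chain built by `install_domain` can be reproduced with plain `NullTranslations` objects and no `.mo` files. `ReportingFallback` below plays the role of `MsgFallback`: it sits at the end of the chain, records the untranslated message, and returns it unchanged (all names here are illustrative):

```python
import gettext

missing = []


class ReportingFallback(gettext.NullTranslations):
    """Last link of the chain: record the untranslated message and
    return it unchanged (same role as MsgFallback above)."""

    def gettext(self, message):
        missing.append(message)
        return message


# Chain two empty "domains" the way install_domain does: the newest
# domain falls back to the previous one, the first one to the reporter.
first = gettext.NullTranslations()
first.add_fallback(ReportingFallback())
second = gettext.NullTranslations()
second.add_fallback(first)

print(second.gettext("untranslated text"))  # untranslated text
print(missing)                              # ['untranslated text']
```

`NullTranslations.gettext` delegates to its fallback when one is set, so a lookup walks the whole chain until some domain answers or the reporter fires.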

@@ -0,0 +1,678 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

"""Module for asynchronous user notifications."""

from dataclasses import dataclass, field
from functools import wraps
from enum import IntEnum
from queue import Queue
from threading import Lock
from typing import Callable, Dict, List, Optional, Any

from .thread_manager import PoolKeys
from .multilingual import _m

# Predefined priority values (lower numbers -> higher priority)
NOTICE_PRIO_LOWEST = 200
NOTICE_PRIO_LOW = 100
NOTICE_PRIO_MEDIUM = 50
NOTICE_PRIO_HIGH = 20
NOTICE_PRIO_URGENT = 1

NOTICE_PRIO_MAT_TEMPLATE = NOTICE_PRIO_HIGH
NOTICE_PRIO_NO_INET = NOTICE_PRIO_LOW  # Show, but other errors take precedence
NOTICE_PRIO_PROXY = NOTICE_PRIO_MEDIUM
NOTICE_PRIO_SETTINGS_WRITE = NOTICE_PRIO_HIGH
NOTICE_PRIO_SURVEY = NOTICE_PRIO_LOWEST
NOTICE_PRIO_UPDATE = NOTICE_PRIO_HIGH + 5  # urgent, but room for "more urgent"
NOTICE_PRIO_RESTART = NOTICE_PRIO_LOW

# Predefined notice IDs
NOTICE_ID_MAT_TEMPLATE = "MATERIAL_TEMPLATE_ERROR"
NOTICE_ID_NO_INET = "NO_INTERNET_CONNECTION"
NOTICE_ID_PROXY = "PROXY_CONNECTION_ERROR"
NOTICE_ID_SETTINGS_WRITE = "SETTINGS_WRITE_ERROR"
NOTICE_ID_SURVEY_FREE = "NPS_INAPP_FREE"
NOTICE_ID_SURVEY_ACTIVE = "NPS_INAPP_ACTIVE"
NOTICE_ID_UPDATE = "UPDATE_READY_MANUAL_INSTALL"
NOTICE_ID_VERSION_ALERT = "ADDON_VERSION_ALERT"
NOTICE_ID_RESTART_ALERT = "NOTICE_ID_RESTART_ALERT"

# Predefined notice titles
# Used as default params in create functions, but should usually be overridden
# by passing in localized titles from the DCC.
NOTICE_TITLE_MAT_TEMPLATE = _m("Material template error")
NOTICE_TITLE_NO_INET = _m("No internet access")
NOTICE_TITLE_PROXY = _m("Encountered proxy error")
NOTICE_TITLE_SETTINGS_WRITE = _m("Failed to write settings")
NOTICE_TITLE_SURVEY = _m("How's the addon?")
NOTICE_TITLE_UPDATE = _m("Update ready")
NOTICE_TITLE_DEPRECATED = _m("Deprecated version")
NOTICE_TITLE_RESTART = _m("Restart needed")

# Predefined notice labels (text to be displayed on the notification banner)
# Used as default params in create functions, but should usually be overridden
# by passing in localized labels from the DCC.
NOTICE_LABEL_NO_INET = _m("Connection Lost")
NOTICE_LABEL_PROXY_ERROR = _m("Proxy Error")
NOTICE_LABEL_RESTART = _m("Restart needed")

# Predefined notice bodies (text to be displayed on the notification popup)
# Used as default params in create functions, but should usually be overridden
# by passing in localized bodies from the DCC.
NOTICE_BODY_NO_INET = _m("Cannot reach Poliigon, double check your "
                         "firewall is configured to access Poliigon servers: "
                         "*poliigon.com / *poliigon.net / *imagedelivery.net. "
                         "If this persists, please reach out to support.")
NOTICE_BODY_RESTART = _m("Please restart your 3D software")

# Predefined icons, assign DCC specific key/reference via init_icons()
NOTICE_ICON_WARN = "ICON_WARN"
NOTICE_ICON_INFO = "ICON_INFO"
NOTICE_ICON_SURVEY = "ICON_SURVEY"
NOTICE_ICON_NO_CONNECTION = "ICON_NO_CONNECTION"


class ActionType(IntEnum):
    # NOTE: Numerical values are still the same as in P4B, but entries got
    # sorted alphabetically.
    OPEN_URL = 1
    POPUP_MESSAGE = 3
    RUN_OPERATOR = 4
    UPDATE_READY = 2


class SignalType(IntEnum):
    """Types of each interaction with the notifications."""

    VIEWED = 0
    DISMISSED = 1
    CLICKED = 2


@dataclass
class Notification:
    """Container object for a user notification.

    NOTE: Do not instance Notification directly, but instead either use the
    NotificationSystem.create_... functions or instance derived
    NotificationXYZ classes.
    """

    # Unique id for this specific kind of notice, if possible re-use the
    # NOTICE_ID_xyz constants above.
    id_notice: str
    # Main title, should be short
    title: str
    # Indicator of how to structure and draw the notification.
    action: ActionType = field(init=False)  # does NOT get auto initialized
    # Priority is always > 0, lower values = higher priority
    priority: int
    # Label to be shown in the notification banner - to be defined per addon
    label: str = ""
    # Allow the user to dismiss the notification.
    allow_dismiss: bool = True
    # Dismiss after the user interacted with the notification
    auto_dismiss: bool = False
    # Hover-over tooltip, if there is a button
    tooltip: str = ""
    # In Blender icons are referenced by strings (icon enum),
    # but this may differ per DCC. For prebuilt notices init_icons() has
    # to be used to store DCC dependent icons, once.
    icon: Optional[Any] = None
    # Defines if the notification has to open a popup with more information
    # and options for the user to then address or dismiss the notice
    open_popup: bool = False
    # Text for the button to execute the notify callable when it opens a popup
    action_string: Optional[str] = None
    # Action callable to be attached to the notification, to be executed
    # when the notification is clicked
    action_callable: Optional[Callable] = None
    # Function to be called when the notification is dismissed (viewed or not)
    on_dismiss_callable: Optional[Callable] = None

    viewed: bool = False  # False until actually drawn
    clicked: bool = False  # False until the user interacts with the notice


@dataclass
class AddonNotificationsParameters:
    """Parameters to be passed in from the addon.

    Parameters:
        update_callable: Callable to be set as action_callable of update
            notifications.
        update_action_text: Action text for updates - used as popup update
            button text.
        update_body: Text with a description of the update - used as popup
            text.

    NOTE: Feel free to add here any other parameter needed from the addon.
    """

    update_callable: Optional[Callable] = None
    update_action_text: str = NOTICE_TITLE_UPDATE
    update_body: str = ""


@dataclass
class NotificationOpenUrl(Notification):
    url: str = ""

    def __post_init__(self):
        self.action = ActionType.OPEN_URL

    def get_key(self) -> str:
        return "".join([self.action.name, self.url, self.label])


@dataclass
class NotificationPopup(Notification):
    body: str = ""
    url: str = ""
    alert: bool = True

    def __post_init__(self):
        self.action = ActionType.POPUP_MESSAGE

    def get_key(self) -> str:
        return "".join([self.action.name, self.url, self.body])


@dataclass
class NotificationRunOperator(Notification):
    # For Blender ops_name will be a string, for C4D not so sure, yet...
    # We could even store a callable in here.
    ops_name: Optional[Any] = None

    def __post_init__(self):
        self.action = ActionType.RUN_OPERATOR

    def get_key(self) -> str:
        return "".join([self.action.name, self.ops_name])


@dataclass
class NotificationUpdateReady(Notification):
    download_url: str = ""
    download_label: str = ""
    logs_url: str = ""
    logs_label: str = ""
    body: str = ""

    def __post_init__(self):
        self.action = ActionType.UPDATE_READY

    def get_key(self) -> str:
        return "".join([self.action.name, self.download_url, self.download_label])


class NotificationSystem:
    """Abstraction to handle asynchronous user notifications.

    Each DCC has to populate icon_dcc_map.
    """

    _api = None  # PoliigonConnector
    _tm = None  # Thread Manager

    _queue_notice: Queue = Queue()
    _lock_notice: Lock = Lock()
    _notices: Dict = {}  # {key: Notification}

    # Each DCC uses init_icons() to populate these values as is fitting for
    # itself.
    icon_dcc_map: Dict[str, Optional[Any]] = {
        NOTICE_ICON_WARN: None,
        NOTICE_ICON_INFO: None,
        NOTICE_ICON_SURVEY: None,
        NOTICE_ICON_NO_CONNECTION: None
    }

    addon_params = AddonNotificationsParameters()

    def __init__(self, addon):
        if addon is None:
            return
        self._api = addon._api
        self._tm = addon._tm

    def init_icons(
            self,
            icon_warn: Optional[Any] = None,
            icon_info: Optional[Any] = None,
            icon_survey: Optional[Any] = None,
            icon_no_connection: Optional[Any] = None
    ) -> None:
        self.icon_dcc_map[NOTICE_ICON_WARN] = icon_warn
        self.icon_dcc_map[NOTICE_ICON_INFO] = icon_info
        self.icon_dcc_map[NOTICE_ICON_SURVEY] = icon_survey
        self.icon_dcc_map[NOTICE_ICON_NO_CONNECTION] = icon_no_connection

    # NOTE: Deliberately defined without self, it is used as a decorator
    # inside this class body.
    def _run_threaded(key_pool: PoolKeys,
                      max_threads: Optional[int] = None,
                      foreground: bool = False) -> Callable:
        """Schedules a function to run in a thread of a chosen pool."""

        def wrapped_func(func: Callable) -> Callable:
            @wraps(func)
            def wrapped_func_call(self, *args, **kwargs):
                args = (self, ) + args
                return self._tm.queue_thread(func,
                                             key_pool,
                                             max_threads,
                                             foreground,
                                             *args,
                                             **kwargs)
            return wrapped_func_call
        return wrapped_func
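`_run_threaded` is a decorator factory evaluated inside the class body: the pool arguments are fixed at class-definition time, while `self` arrives per call and provides the thread manager. A runnable sketch of the pattern, using a hypothetical `InlineTM` stand-in (the real `queue_thread` signature differs and actually queues onto a pool):

```python
from functools import wraps


def run_threaded(key_pool):
    # Decorator factory: key_pool is bound at class-definition time,
    # self arrives per call and supplies the thread manager.
    def wrapped_func(func):
        @wraps(func)
        def wrapped_func_call(self, *args, **kwargs):
            return self._tm.queue_thread(func, key_pool, (self,) + args, kwargs)
        return wrapped_func_call
    return wrapped_func


class InlineTM:
    """Hypothetical stand-in for the real thread manager: runs the job
    synchronously instead of queueing it on a pool."""

    def queue_thread(self, func, key_pool, args, kwargs):
        return func(*args, **kwargs)


class Worker:
    _tm = InlineTM()

    @run_threaded("INTERACTIVE")
    def _task(self, x):
        return x * 2


print(Worker()._task(21))  # 42
```

Because the decorator captures the undecorated `func`, the thread manager receives the plain method and explicitly prepends `self` to its arguments, exactly as `_run_threaded` does above.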

    def _consume_queued_notices(self) -> None:
        """Empties the notice queue and stores all new notices in _notices.

        NOTE: If an identical notice already exists, it will get skipped.
        """

        with self._lock_notice:
            while self._queue_notice.qsize() > 0:
                notice = self._queue_notice.get(block=False)
                key = notice.get_key()
                if key in self._notices:
                    continue
                self._notices[key] = notice

    def _get_sorted_notices(self) -> List[Notification]:
        """Returns a priority sorted list with all notices."""

        with self._lock_notice:
            all_notices = list(self._notices.values())
            all_notices.sort(key=lambda notice: notice.priority)
            return all_notices
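The two methods above form a drain-dedupe-sort pipeline: enqueued notices are moved into a dict keyed by `get_key()` (duplicates skipped), then sorted ascending by priority so the lowest number surfaces first. In isolation, with a hypothetical `Notice` stand-in and invented keys:

```python
from dataclasses import dataclass
from queue import Queue


@dataclass
class Notice:
    key: str
    priority: int  # lower value = higher priority


queue = Queue()
for notice in (Notice("UPDATE", 25), Notice("NO_INET", 100), Notice("UPDATE", 25)):
    queue.put(notice)

# Drain the queue, skipping notices whose key is already stored.
notices = {}
while queue.qsize() > 0:
    notice = queue.get(block=False)
    if notice.key in notices:
        continue
    notices[notice.key] = notice

# Sort ascending by priority: the most urgent notice comes first.
by_prio = sorted(notices.values(), key=lambda n: n.priority)
print([n.key for n in by_prio])  # ['UPDATE', 'NO_INET']
```

The duplicate "UPDATE" notice is dropped, and with priorities 25 and 100 the update notice outranks the connectivity one.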

    @_run_threaded(PoolKeys.INTERACTIVE)
    def _thread_signal(
            self, notice: Notification, signal_type: SignalType) -> None:
        """Asynchronously signals a notice interaction to the server."""

        if signal_type == SignalType.VIEWED:
            self._api.signal_view_notification(notice.id_notice)
        elif signal_type == SignalType.DISMISSED:
            self._api.signal_dismiss_notification(notice.id_notice)
        elif signal_type == SignalType.CLICKED:
            self._api.signal_click_notification(notice.id_notice, notice.action)

    def _signal_view(self, notice: Notification) -> None:
        """Internally used to start the signal view thread."""

        if self._api is None or not self._api._is_opted_in():
            return
        self._thread_signal(notice, SignalType.VIEWED)

    def _signal_clicked(self, notice: Notification) -> None:
        """Internally used to start the signal click thread."""

        if self._api is None or not self._api._is_opted_in():
            return
        self._thread_signal(notice, SignalType.CLICKED)

    def _signal_dismiss(self, notice: Notification) -> None:
        """Internally used to start the signal dismiss thread."""

        if self._api is None or not self._api._is_opted_in():
            return
        self._thread_signal(notice, SignalType.DISMISSED)

    def enqueue_notice(self, notice: Notification) -> None:
        """Enqueues a new notification."""

        self._queue_notice.put(notice)

    def dismiss_notice(
            self, notice: Notification, force: bool = False) -> None:
        """Dismisses a notice.

        Use the force parameter to dismiss 'un-dismissable' notices, e.g.
        a 'no internet' notice, when internet is back on.
        """

        if not notice.allow_dismiss and not force:
            return

        if not notice.clicked:
            self._signal_dismiss(notice)

        if notice.on_dismiss_callable is not None:
            notice.on_dismiss_callable()

        key = notice.get_key()
        with self._lock_notice:
            if key in self._notices:
                del self._notices[key]

    def clicked_notice(self, notice: Notification) -> None:
        """To be called when a user interacted with the notice."""

        notice.clicked = True
        self._signal_clicked(notice)

        if notice.action_callable is not None:
            notice.action_callable()

        if not notice.auto_dismiss:
            return
        self.dismiss_notice(notice)

    def get_all_notices(self) -> List[Notification]:
        """Returns a priority sorted list with all notices.

        Usually called from draw code.
        """

        self._consume_queued_notices()
        return self._get_sorted_notices()

    def get_top_notice(
            self, do_signal_view: bool = False) -> Optional[Notification]:
        """Returns the current highest priority notice.

        Usually called from draw code.
        """

        notices_by_prio = self.get_all_notices()
        try:
            notice = notices_by_prio[0]
            if do_signal_view and not notice.viewed:
                self._signal_view(notice)
                notice.viewed = True
        except (KeyError, IndexError):
            notice = None

        return notice
|
||||
|
||||
def notification_popup(
|
||||
self, notice: Notification, do_signal_view: bool = False) -> None:
|
||||
"""Called when a popup notification is drawn"""
|
||||
|
||||
if do_signal_view and not notice.viewed:
|
||||
self._signal_view(notice)
|
||||
notice.viewed = True
|
||||
|
||||
def flush_all(self) -> None:
|
||||
"""Flushes all existing notices."""
|
||||
|
||||
while not self._queue_notice.empty():
|
||||
self._queue_notice.get(block=False)
|
||||
|
||||
with self._lock_notice:
|
||||
self._notices = {}
|
||||
|
||||
    def create_restart_needed(self,
                              title: str = NOTICE_TITLE_RESTART,
                              *,
                              label: str = NOTICE_LABEL_RESTART,
                              tooltip: str = "",
                              body: str = NOTICE_BODY_RESTART,
                              action_string: Optional[str] = None,
                              auto_enqueue: bool = True
                              ) -> Notification:
        """Returns a pre-built 'Restart Needed' notice."""

        notice = NotificationPopup(
            id_notice=NOTICE_ID_RESTART_ALERT,
            title=title,
            label=label,
            priority=NOTICE_PRIO_RESTART,
            allow_dismiss=False,
            open_popup=True,
            action_string=action_string,
            tooltip=tooltip,
            icon=self.icon_dcc_map[NOTICE_ICON_WARN],
            body=body
        )
        if auto_enqueue:
            self.enqueue_notice(notice)
        return notice

    def create_no_internet(self,
                           title: str = NOTICE_TITLE_NO_INET,
                           *,
                           label: str = NOTICE_LABEL_NO_INET,
                           tooltip: str = "",
                           body: str = NOTICE_BODY_NO_INET,
                           auto_enqueue: bool = True
                           ) -> Notification:
        """Returns a pre-built 'No internet' notice."""

        notice = NotificationPopup(
            id_notice=NOTICE_ID_NO_INET,
            title=title,
            label=label,
            priority=NOTICE_PRIO_NO_INET,
            allow_dismiss=False,
            open_popup=True,
            action_string=None,
            tooltip=tooltip,
            icon=self.icon_dcc_map[NOTICE_ICON_NO_CONNECTION],
            body=body
        )
        if auto_enqueue:
            self.enqueue_notice(notice)
        return notice

    def create_proxy(self,
                     title: str = NOTICE_TITLE_PROXY,
                     *,
                     label: str = NOTICE_LABEL_PROXY_ERROR,
                     tooltip: str = "",
                     body: str = NOTICE_BODY_NO_INET,
                     auto_enqueue: bool = True
                     ) -> Notification:
        """Returns a pre-built 'Proxy error' notice."""

        notice = NotificationPopup(
            id_notice=NOTICE_ID_PROXY,
            title=title,
            label=label,
            priority=NOTICE_PRIO_PROXY,
            allow_dismiss=False,
            open_popup=True,
            action_string=None,
            tooltip=tooltip,
            icon=self.icon_dcc_map[NOTICE_ICON_WARN],
            body=body
        )
        if auto_enqueue:
            self.enqueue_notice(notice)
        return notice

    def create_survey(self,
                      title: str = NOTICE_TITLE_SURVEY,
                      *,
                      is_free_user: bool,
                      tooltip: str,
                      free_survey_url: str,
                      active_survey_url: str,
                      label: str,
                      auto_enqueue: bool = True,
                      on_dismiss_callable: Optional[Callable] = None
                      ) -> Notification:
        """Returns a pre-built 'user survey' notice."""

        if is_free_user:
            id_notice = NOTICE_ID_SURVEY_FREE
            url = free_survey_url
        else:
            id_notice = NOTICE_ID_SURVEY_ACTIVE
            url = active_survey_url
        notice = NotificationOpenUrl(
            id_notice=id_notice,
            title=title,
            priority=NOTICE_PRIO_SURVEY,
            allow_dismiss=True,
            auto_dismiss=True,
            tooltip=tooltip,
            url=url,
            label=label,
            icon=self.icon_dcc_map[NOTICE_ICON_SURVEY],
            on_dismiss_callable=on_dismiss_callable
        )
        if auto_enqueue:
            self.enqueue_notice(notice)
        return notice

    def create_write_mat_template(self,
                                  title: str = NOTICE_TITLE_MAT_TEMPLATE,
                                  *,
                                  tooltip: str,
                                  body: str,
                                  auto_enqueue: bool = True
                                  ) -> Notification:
        """Returns a pre-built 'Material template error' notice."""

        notice = NotificationPopup(
            id_notice=NOTICE_ID_MAT_TEMPLATE,
            title=title,
            priority=NOTICE_PRIO_MAT_TEMPLATE,
            allow_dismiss=True,
            tooltip=tooltip,
            icon=self.icon_dcc_map[NOTICE_ICON_WARN],
            body=body,
            alert=True
        )
        if auto_enqueue:
            self.enqueue_notice(notice)
        return notice

    def create_version_alert(self,
                             title: str = NOTICE_TITLE_DEPRECATED,
                             *,
                             priority: int,
                             label: str,
                             tooltip: str,
                             open_popup: bool,
                             allow_dismiss: bool = True,
                             auto_dismiss: bool = True,
                             body: Optional[str] = None,
                             action_string: Optional[str] = None,
                             url: Optional[str] = None,
                             auto_enqueue: bool = True
                             ) -> Notification:
        """Returns a pre-built 'Version Alert' notice.

        Note: An Alert Notification can be a NotificationPopup or a
        NotificationOpenUrl, depending on the given AlertData
        """

        if open_popup:
            notice = NotificationPopup(
                id_notice=NOTICE_ID_VERSION_ALERT,
                title=title,
                priority=priority,
                allow_dismiss=allow_dismiss,
                auto_dismiss=auto_dismiss,
                tooltip=tooltip,
                label=label,
                icon=self.icon_dcc_map[NOTICE_ICON_WARN],
                open_popup=open_popup,
                body=body,
                action_string=action_string
            )
        else:
            notice = NotificationOpenUrl(
                id_notice=NOTICE_ID_VERSION_ALERT,
                url=url,
                title=title,
                priority=priority,
                allow_dismiss=allow_dismiss,
                auto_dismiss=auto_dismiss,
                tooltip=tooltip,
                label=label,
                icon=self.icon_dcc_map[NOTICE_ICON_WARN]
            )

        if auto_enqueue:
            self.enqueue_notice(notice)
        return notice

    def create_write_settings_error(self,
                                    title: str = NOTICE_TITLE_SETTINGS_WRITE,
                                    *,
                                    tooltip: str,
                                    body: str,
                                    auto_enqueue: bool = True
                                    ) -> Notification:
        """Returns a pre-built 'write settings error' notice."""

        notice = NotificationPopup(
            id_notice=NOTICE_ID_SETTINGS_WRITE,
            title=title,
            priority=NOTICE_PRIO_SETTINGS_WRITE,
            allow_dismiss=True,
            tooltip=tooltip,
            icon=self.icon_dcc_map[NOTICE_ICON_WARN],
            body=body,
            alert=True
        )
        if auto_enqueue:
            self.enqueue_notice(notice)
        return notice

    def create_update(self,
                      title: str = NOTICE_TITLE_UPDATE,
                      *,
                      tooltip: str,
                      label: str,
                      download_url: str,
                      download_label: str = "",
                      logs_url: str = "",
                      logs_label: str = "",
                      auto_enqueue: bool = True,
                      open_popup: bool = True,
                      auto_dismiss: bool = True,
                      action_string: Optional[str] = None,
                      body: Optional[str] = None,
                      action_callable: Optional[Callable] = None
                      ) -> Notification:
        """Returns a pre-built 'Update available' notice."""

        if action_string is None:
            action_string = self.addon_params.update_action_text
        if body is None:
            body = self.addon_params.update_body
        if action_callable is None:
            action_callable = self.addon_params.update_callable

        notice = NotificationUpdateReady(
            id_notice=NOTICE_ID_UPDATE,
            title=title,
            priority=NOTICE_PRIO_UPDATE,
            allow_dismiss=True,
            auto_dismiss=auto_dismiss,
            tooltip=tooltip,
            download_url=download_url,
            download_label=download_label,
            label=label,
            logs_url=logs_url,
            logs_label=logs_label,
            icon=self.icon_dcc_map[NOTICE_ICON_INFO],
            open_popup=open_popup,
            action_string=action_string,
            body=body,
            action_callable=action_callable
        )
        if auto_enqueue:
            self.enqueue_notice(notice)
        return notice

# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

from dataclasses import dataclass
from datetime import datetime
from enum import Enum
import html

from typing import Any, Dict, List, Optional

from .notifications import Notification, ActionType
from .api import STR_NO_PLAN

SIGNAL_PLAN_SUBSCRIBE_EDU = "PLAN_SUBSCRIBE_EDU"
SIGNAL_PLAN_SUBSCRIBE = "PLAN_SUBSCRIBE"
SIGNAL_PLAN_UPGRADE_NO_DLS = "PLAN_UPGRADE_NO_DLS"
SIGNAL_PLAN_PROMPT_UNLIMITED = "PLAN_PROMPT_UNLIMITED"
SIGNAL_PLAN_RESUME_PAUSED = "PLAN_RESUME_PAUSED"
SIGNAL_PLAN_RESUME_CANCELLATION = "PLAN_RESUME_SCHEDULED_CANCEL"
SIGNAL_PLAN_RESUME_SCHEDULED_PAUSE = "PLAN_RESUME_SCHEDULED_PAUSE"

class SubscriptionState(Enum):
    """Values for allowed user subscription states."""
    NOT_POPULATED = 0
    FREE = 1
    ACTIVE = 2
    PAUSED = 3
    PAUSE_SCHEDULED = 4
    CANCELLED = 5

class PlanUpgradeStatus(Enum):
    NOT_POPULATED = 0
    STUDENT_DISCOUNT = 1
    TEACHER_DISCOUNT = 2
    BECOME_PRO = 3
    UPGRADE_PLAN_BALANCE = 4
    RESUME_PLAN = 5
    REMOVE_SCHEDULED_PAUSE = 6
    REMOVE_CANCELLATION = 7
    UPGRADE_PLAN_UNLIMITED = 8
    NO_UPGRADE_AVAILABLE = 9

    def get_signal_string(self) -> Optional[str]:
        if self == self.NOT_POPULATED:
            return None
        elif self in [self.STUDENT_DISCOUNT, self.TEACHER_DISCOUNT]:
            return SIGNAL_PLAN_SUBSCRIBE_EDU
        elif self == self.BECOME_PRO:
            return SIGNAL_PLAN_SUBSCRIBE
        elif self == self.UPGRADE_PLAN_BALANCE:
            return SIGNAL_PLAN_UPGRADE_NO_DLS
        elif self == self.RESUME_PLAN:
            return SIGNAL_PLAN_RESUME_PAUSED
        elif self == self.REMOVE_SCHEDULED_PAUSE:
            return SIGNAL_PLAN_RESUME_SCHEDULED_PAUSE
        elif self == self.REMOVE_CANCELLATION:
            return SIGNAL_PLAN_RESUME_CANCELLATION
        elif self == self.UPGRADE_PLAN_UNLIMITED:
            return SIGNAL_PLAN_PROMPT_UNLIMITED
        else:
            return None

def _decode_currency_symbol(currency_str: str) -> str:
    decoded_str = ""
    chars = currency_str.split(";")
    for _char in chars:
        # Process chars in HTML format (e.g. "82;$" => "R$")
        try:
            int_char = int(_char)
            _char = chr(int_char)
        except ValueError:
            _char = html.unescape(_char)
            if len(_char) != 1:
                _char = ""
        decoded_str += _char
    return decoded_str

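To make the `"82;$" => "R$"` example concrete: each semicolon-separated token is either a decimal code point (decoded with `chr`) or a literal/HTML-entity fragment (passed through `html.unescape`). A standalone copy of the helper for experimentation (renamed `decode_currency_symbol` to avoid clashing with the module-level function):

```python
import html


def decode_currency_symbol(currency_str: str) -> str:
    # Standalone copy of _decode_currency_symbol() above.
    decoded_str = ""
    for _char in currency_str.split(";"):
        try:
            _char = chr(int(_char))  # decimal code point, e.g. "82" -> "R"
        except ValueError:
            _char = html.unescape(_char)  # literal or HTML entity
            if len(_char) != 1:
                _char = ""  # drop anything that isn't a single character
        decoded_str += _char
    return decoded_str


symbol = decode_currency_symbol("82;$")  # "R" + "$"
euro = decode_currency_symbol("8364;")   # chr(8364) is the euro sign
```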
@dataclass
class PoliigonPlanUpgradeInfo:
    ok: bool
    error: Optional[str] = None

    action: Optional[str] = None
    amount_due: Optional[str] = None
    amount_due_renewal: Optional[str] = None
    renewal_date: Optional[str] = None
    tax_rate: Optional[int] = None
    currency_code: Optional[str] = None
    currency_symbol: Optional[str] = None
    previous_assets: Optional[int] = None
    new_assets: Optional[int] = None
    previous_users: Optional[int] = None
    new_users: Optional[int] = None

    @classmethod
    def from_dict(cls, dictionary: Dict):
        new = cls(ok=True)

        new.action = dictionary.get("action")
        new.amount_due = dictionary.get("amount_due")
        new.amount_due_renewal = dictionary.get("amount_due_renewal")
        new.renewal_date = dictionary.get("renewal_date")
        new.tax_rate = dictionary.get("tax_rate")
        new.currency_code = dictionary.get("currency_code")
        new.currency_symbol = _decode_currency_symbol(
            dictionary.get("currency_symbol", ""))
        new.previous_assets = dictionary.get("previous_assets")
        if isinstance(new.previous_assets, str):
            new.previous_assets = new.previous_assets.title()
        new.new_assets = dictionary.get("new_assets")
        if isinstance(new.new_assets, str):
            new.new_assets = new.new_assets.title()
        new.previous_users = dictionary.get("previous_users")
        new.new_users = dictionary.get("new_users")

        return new

@dataclass
class PoliigonSubscription:
    """Container object for a subscription."""

    plan_name: Optional[str] = None
    plan_credit: Optional[int] = None
    next_credit_renewal_date: Optional[datetime] = None
    current_term_end: Optional[datetime] = None
    next_subscription_renewal_date: Optional[datetime] = None
    plan_paused_at: Optional[datetime] = None
    plan_paused_until: Optional[datetime] = None
    subscription_state: Optional[SubscriptionState] = SubscriptionState.NOT_POPULATED
    period_unit: Optional[str] = None  # e.g. per "month" or "year" for renewing
    plan_price_id: Optional[str] = None
    plan_price: Optional[str] = None  # e.g. "123"
    currency_code: Optional[str] = None  # e.g. "USD"
    base_price: Optional[float] = None  # e.g. 123.45
    currency_symbol: Optional[str] = None  # e.g. "$" (special character)
    is_unlimited: Optional[bool] = None
    has_team: Optional[bool] = None

    @staticmethod
    def _to_float(value: Optional[str]) -> Optional[float]:
        if value is None:
            return None

        # Strip thousands separators so the string parses as a float.
        # For some users this value can be formatted as "1,000".
        if isinstance(value, str) and "," in value:
            value = value.replace(",", "")

        try:
            return float(value)
        except (ValueError, TypeError):
            return None

    def update_from_upgrade_dict(self, plan_dictionary: Dict) -> None:
        """Updates the instance from an API V2 available plans
        response plan dictionary."""

        if plan_dictionary.get("name") and plan_dictionary["name"] != STR_NO_PLAN:
            self.plan_name = plan_dictionary["name"]
            self.plan_credit = plan_dictionary.get("meta", {}).get("credits")

            self.period_unit = plan_dictionary.get("periodUnit", None)
            self.plan_price_id = plan_dictionary.get("id", None)
            self.plan_price = plan_dictionary.get("price", None)

            self.currency_code = plan_dictionary.get("currencyCode", None)
            self.base_price = self._to_float(plan_dictionary.get("basePrice", None))

            self.currency_symbol = _decode_currency_symbol(
                plan_dictionary.get("currency_symbol", ""))

            self.is_unlimited = bool(plan_dictionary.get("meta", {}).get("unlimited"))
            self.has_team = bool(plan_dictionary.get("meta", {}).get("hasTeams"))
        else:
            self.plan_name = None
            self.plan_credit = None
            self.next_subscription_renewal_date = None
            self.next_credit_renewal_date = None
            self.subscription_state = SubscriptionState.FREE
            self.period_unit = None
            self.plan_price_id = None
            self.plan_price = None
            self.currency_code = None
            self.base_price = None
            self.currency_symbol = None

    def update_from_dict(self, plan_dictionary: Dict) -> None:
        """Updates the instance from API V1 subscription data (some API
        V2 endpoints still return the same structure as API V1,
        e.g. put_upgrade_plan).
        """

        if plan_dictionary.get("plan_name") and plan_dictionary["plan_name"] != STR_NO_PLAN:
            self.plan_name = plan_dictionary["plan_name"]
            self.plan_credit = plan_dictionary.get("plan_credit", None)

            # Extract "2022-08-19" from "2022-08-19 23:58:37"
            renew = plan_dictionary.get("next_subscription_renewal_date", None)
            try:
                renew = datetime.strptime(renew, "%Y-%m-%d %H:%M:%S")
                self.next_subscription_renewal_date = renew
            except (ValueError, TypeError):
                self.next_subscription_renewal_date = None

            end_plan = plan_dictionary.get("current_term_end", None)
            try:
                end_plan = datetime.strptime(end_plan, "%Y-%m-%d %H:%M:%S")
                self.current_term_end = end_plan
            except (ValueError, TypeError):
                self.current_term_end = None

            next_credits = plan_dictionary.get("next_credit_renewal_date", None)
            try:
                next_credits = datetime.strptime(
                    next_credits, "%Y-%m-%d %H:%M:%S")
                self.next_credit_renewal_date = next_credits
            except (ValueError, TypeError):
                self.next_credit_renewal_date = None

            paused_plan_info = plan_dictionary.get("paused_info", None)
            not_renewing = self.next_subscription_renewal_date is None
            if paused_plan_info is not None:
                self.subscription_state = SubscriptionState.PAUSED
                paused_date = paused_plan_info.get("pause_date", None)
                resume_date = paused_plan_info.get("resume_date", None)
                try:
                    self.plan_paused_at = datetime.strptime(
                        paused_date, "%Y-%m-%d %H:%M:%S")
                    self.plan_paused_until = datetime.strptime(
                        resume_date, "%Y-%m-%d %H:%M:%S")

                    now = datetime.now()
                    if now < self.plan_paused_at or now > self.plan_paused_until:
                        self.subscription_state = SubscriptionState.PAUSE_SCHEDULED
                except (ValueError, TypeError):
                    self.plan_paused_until = None
                    self.plan_paused_at = None
            elif not_renewing:
                self.subscription_state = SubscriptionState.CANCELLED
            else:
                self.plan_paused_until = None
                self.plan_paused_at = None
                self.subscription_state = SubscriptionState.ACTIVE

            self.period_unit = plan_dictionary.get("period_unit", None)
            self.plan_price_id = plan_dictionary.get("plan_price_id", None)

            self.plan_price = plan_dictionary.get("plan_price", None)
            self.currency_code = plan_dictionary.get("currency_code", None)

            self.base_price = self._to_float(plan_dictionary.get("base_price", None))
            self.currency_symbol = _decode_currency_symbol(
                plan_dictionary.get("currency_symbol", ""))

            unlimited = plan_dictionary.get("unlimited", None)
            if unlimited is not None:
                self.is_unlimited = bool(unlimited)

            team_id = plan_dictionary.get("team_id", None)
            if team_id is not None:
                self.has_team = bool(team_id)
        else:
            self.plan_name = None
            self.plan_credit = None
            self.next_subscription_renewal_date = None
            self.next_credit_renewal_date = None
            self.subscription_state = SubscriptionState.FREE
            self.period_unit = None
            self.plan_price_id = None
            self.plan_price = None
            self.currency_code = None
            self.base_price = None
            self.currency_symbol = None

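The price parsing above has to cope with locale-formatted strings like `"1,000"`, which is why `_to_float` strips commas before calling `float`. A standalone sketch of that parsing (a plain function mirroring the static method, not the class itself):

```python
from typing import Optional


def to_float(value: Optional[str]) -> Optional[float]:
    # Sketch of PoliigonSubscription._to_float(): strip thousands
    # separators, then parse, returning None on any failure.
    if value is None:
        return None
    if isinstance(value, str) and "," in value:
        value = value.replace(",", "")
    try:
        return float(value)
    except (ValueError, TypeError):
        return None
```

Note the `value = value.replace(...)` assignment: `str.replace` returns a new string, so discarding its result would silently leave the comma in place and make `float("1,000")` raise.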
class PoliigonPlanUpgradeManager:
    available_plans: List[Any]  # List[PoliigonSubscription]
    upgrade_plan: Optional[Any] = None  # Optional[PoliigonSubscription]
    status: Optional[PlanUpgradeStatus] = PlanUpgradeStatus.NOT_POPULATED

    upgrade_info: Optional[PoliigonPlanUpgradeInfo] = None
    show_banner: bool = False
    upgrade_dismissed: bool = False
    banner_status_emitted: Optional[PlanUpgradeStatus] = None

    # Upgrade banner and popup content to be used in the DCC UI
    content: Optional[Any] = None  # UpgradeContent (circular import)

    def __init__(self,
                 addon: Any  # PoliigonAddon
                 ):
        self.addon = addon
        self.user = self.addon.user
        self.available_plans = []
        self.set_upgrade_status()

    def refresh(self,
                plans_info: Optional[Dict] = None,
                only_resume_popup: bool = False,
                clean_plans: bool = False
                ) -> None:
        self.user = self.addon.user
        if clean_plans:
            self.available_plans = []
        if plans_info is not None:
            self.set_available_plans(plans_info)
        self.set_upgrade_plan()
        self.set_upgrade_status()
        self.set_show_banner()
        if self.content is not None:
            self.content.refresh(self, only_resume_popup)

    def get_last_dismiss(self) -> Optional[datetime]:
        last_dismiss = self.addon.settings_config.get(
            "upgrade", "last_dismiss", fallback=None)
        if last_dismiss is None:
            return None
        return datetime.strptime(last_dismiss, "%Y-%m-%d %H:%M:%S")

    def check_last_dismiss_interval(self, day_interval: int = 7) -> bool:
        last_dismiss = self.get_last_dismiss()
        if last_dismiss is None:
            return True

        diff = datetime.now() - last_dismiss
        if diff.days >= day_interval:
            return True
        return False

    def set_show_banner(self) -> None:
        if self.user is None:
            return
        if self.addon.user.credits is None:
            return
        upgrade_available = None not in [self.upgrade_info, self.upgrade_plan]
        if self.status in [PlanUpgradeStatus.NOT_POPULATED,
                           PlanUpgradeStatus.NO_UPGRADE_AVAILABLE]:
            self.show_banner = False
        elif self.status in [PlanUpgradeStatus.STUDENT_DISCOUNT,
                             PlanUpgradeStatus.TEACHER_DISCOUNT,
                             PlanUpgradeStatus.BECOME_PRO,
                             PlanUpgradeStatus.RESUME_PLAN,
                             PlanUpgradeStatus.REMOVE_SCHEDULED_PAUSE,
                             PlanUpgradeStatus.REMOVE_CANCELLATION]:
            self.show_banner = True
        elif self.status == PlanUpgradeStatus.UPGRADE_PLAN_BALANCE:
            self.show_banner = upgrade_available
        elif self.status == PlanUpgradeStatus.UPGRADE_PLAN_UNLIMITED:
            if self.addon.user.plan.plan_credit is None or not self.check_last_dismiss_interval():
                self.show_banner = False
            else:
                self.show_banner = upgrade_available
        else:
            self.show_banner = False

    def emit_signal(self,
                    view: bool = False,
                    dismiss: bool = False,
                    clicked: bool = False) -> None:
        if self.status is None:
            return

        signal_str = self.status.get_signal_string()

        if signal_str is None:
            return

        action_type = ActionType.OPEN_URL
        if self.content is not None and self.content.open_popup:
            action_type = ActionType.POPUP_MESSAGE

        # Mocked Notification to be used for signals
        signal_notice = Notification(id_notice=signal_str,
                                     title=self.status.name,
                                     priority=0,
                                     label=self.status.name)
        signal_notice.action = action_type

        if dismiss and not self.upgrade_dismissed:
            self.addon.notify._signal_dismiss(signal_notice)
        elif view:
            self.addon.notify._signal_view(signal_notice)
        elif clicked:
            self.addon.notify._signal_clicked(signal_notice)

    def check_show_banner(self) -> bool:
        if self.user is None:
            return False
        do_show_banner = self.show_banner

        # Check if the status changed since the last view signal
        different_emit_signal_status = self.banner_status_emitted != self.status
        if do_show_banner and different_emit_signal_status:
            self.emit_signal(view=True)
            self.banner_status_emitted = self.status

        return do_show_banner

    def dismiss_upgrade(self) -> None:
        date_now = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
        self.addon.settings_config.set(
            "upgrade", "last_dismiss", date_now)
        self.set_show_banner()
        self.addon._settings.save_settings()
        self.emit_signal(dismiss=True)
        self.upgrade_dismissed = True

    def set_upgrade_status(self) -> None:
        if self.user is None:
            return
        if self.user.credits is None:
            return
        subscription_state = self.user.plan.subscription_state
        if subscription_state == SubscriptionState.FREE:
            if self.user.is_student:
                self.status = PlanUpgradeStatus.STUDENT_DISCOUNT
            elif self.user.is_teacher:
                self.status = PlanUpgradeStatus.TEACHER_DISCOUNT
            else:
                self.status = PlanUpgradeStatus.BECOME_PRO
        elif subscription_state == SubscriptionState.PAUSED:
            self.status = PlanUpgradeStatus.RESUME_PLAN
        elif subscription_state == SubscriptionState.PAUSE_SCHEDULED:
            self.status = PlanUpgradeStatus.REMOVE_SCHEDULED_PAUSE
        elif subscription_state == SubscriptionState.CANCELLED:
            self.status = PlanUpgradeStatus.REMOVE_CANCELLATION
        elif subscription_state == SubscriptionState.ACTIVE:
            if self.upgrade_plan is None:
                self.status = PlanUpgradeStatus.NO_UPGRADE_AVAILABLE
            elif self.user.credits == 0:
                self.status = PlanUpgradeStatus.UPGRADE_PLAN_BALANCE
            elif self.upgrade_plan.is_unlimited:
                self.status = PlanUpgradeStatus.UPGRADE_PLAN_UNLIMITED
            else:
                self.status = PlanUpgradeStatus.NO_UPGRADE_AVAILABLE
        else:
            self.status = PlanUpgradeStatus.NO_UPGRADE_AVAILABLE

    def set_available_plans(self, plans_dict: Dict) -> None:
        yearly_plans = plans_dict.get("plan_year", [])
        monthly_plans = plans_dict.get("plan_month", [])

        # Clean available plans before populating again
        self.available_plans = []

        for _plan in (yearly_plans + monthly_plans):
            plan_data = PoliigonSubscription()
            plan_data.update_from_upgrade_dict(_plan)
            self.available_plans.append(plan_data)

    def set_upgrade_plan(self) -> None:
        """Defines the next plan to offer to the user.

        We have two main scenarios for upgrading:
        Upgrade to Pro plan: If a given user has a next Pro plan available
        and (only if) their credits are empty, we offer the next Pro plan
        (not dismissible);

        Upgrade to Unlimited:
        For any Pro plan user whose credits are more than zero, we should
        show the Upgrade to Unlimited banner (this one is dismissible).
        """

        if self.user is None or len(self.available_plans) == 0:
            return

        if self.user.plan.is_unlimited:
            # The only benefit of upgrading is more downloads; if you're
            # already unlimited, there's nothing to upgrade to.
            return

        if self.user.plan.has_team:
            # Don't offer upgrades to team members, since these contracts
            # are handled separately.
            return

        upgrade_pro_plan = None
        upgrade_unlimited_plan = None
        filter_period_unit = [_plan for _plan in self.available_plans
                              if _plan.period_unit == self.user.plan.period_unit]

        filter_has_team = [_plan for _plan in filter_period_unit
                           if _plan.has_team == self.user.plan.has_team]

        sorted_price_plans = sorted(filter_has_team, key=lambda plan: plan.plan_credit)
        for _plan in sorted_price_plans:
            if self.user.plan.is_unlimited and not _plan.is_unlimited:
                # Don't offer credit-based plans if they are already unlimited
                continue

            if self.user.plan.plan_credit >= _plan.plan_credit and not _plan.is_unlimited:
                # Don't offer plans which have the same or fewer credits
                continue

            if _plan.is_unlimited and upgrade_unlimited_plan is None:
                upgrade_unlimited_plan = _plan
            if upgrade_pro_plan is None:
                upgrade_pro_plan = _plan
            if None not in [upgrade_unlimited_plan, upgrade_pro_plan]:
                break

        if upgrade_pro_plan is None and upgrade_unlimited_plan is None:
            self.upgrade_plan = None
        elif self.user.credits == 0 and upgrade_pro_plan is not None:
            self.upgrade_plan = upgrade_pro_plan
        else:
            self.upgrade_plan = upgrade_unlimited_plan

    def finish_upgrade_plan(self) -> None:
        """Called to confirm an upgrade, resume, or plan choice on the
        addon DCC side."""
        if self.addon.api_rc is None:
            self.addon.logger.error("API RC not defined")
            return

        choose_plans_status = [PlanUpgradeStatus.STUDENT_DISCOUNT,
                               PlanUpgradeStatus.TEACHER_DISCOUNT,
                               PlanUpgradeStatus.BECOME_PRO]

        resume_status = [PlanUpgradeStatus.RESUME_PLAN,
                         PlanUpgradeStatus.REMOVE_SCHEDULED_PAUSE,
                         PlanUpgradeStatus.REMOVE_CANCELLATION]

        upgrade_status = [PlanUpgradeStatus.UPGRADE_PLAN_BALANCE,
                          PlanUpgradeStatus.UPGRADE_PLAN_UNLIMITED]

        if self.status in choose_plans_status:
            self.addon._api.open_poliigon_link("subscribe")
        elif self.status in resume_status:
            callback = self.addon.api_rc._addon_params.callback_resume_plan
            self.addon.api_rc.add_job_resume_plan(callback_done=callback)
        elif self.status in upgrade_status:
            callback = self.addon.api_rc._addon_params.callback_put_upgrade_plan
            self.addon.api_rc.add_job_put_upgrade_plan(callback_done=callback)
        else:
            self.addon.logger.error(
                f"Current user not available for upgrade: {self.user}")

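The selection loop in `set_upgrade_plan` can be hard to follow in the abstract: it scans plans sorted by credit count, remembers the first strictly larger Pro plan and the first unlimited plan, then prefers the Pro plan only when the user's credit balance is empty. A minimal standalone sketch of that core (the `_Plan` stand-in and integer credits on unlimited plans are assumptions for illustration):

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class _Plan:
    # Hypothetical, simplified stand-in for PoliigonSubscription.
    plan_credit: int
    is_unlimited: bool = False


def pick_upgrade(current_credits: int, balance: int,
                 plans: List[_Plan]) -> Optional[_Plan]:
    upgrade_pro = None
    upgrade_unlimited = None
    for plan in sorted(plans, key=lambda p: p.plan_credit):
        if current_credits >= plan.plan_credit and not plan.is_unlimited:
            continue  # same or fewer credits: not an upgrade
        if plan.is_unlimited and upgrade_unlimited is None:
            upgrade_unlimited = plan
        if upgrade_pro is None:
            upgrade_pro = plan
        if None not in [upgrade_unlimited, upgrade_pro]:
            break
    if upgrade_pro is None and upgrade_unlimited is None:
        return None
    if balance == 0 and upgrade_pro is not None:
        return upgrade_pro  # empty balance: offer the next Pro plan
    return upgrade_unlimited  # otherwise prompt for Unlimited


plans = [_Plan(50), _Plan(200), _Plan(1000, is_unlimited=True)]
empty_balance_pick = pick_upgrade(50, 0, plans)
has_balance_pick = pick_upgrade(50, 10, plans)
```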
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

import os

try:
    import ConfigParser
except ImportError:
    import configparser as ConfigParser

class PoliigonSettings:
    """Settings used for the addon."""

    addon_name: str  # e.g. poliigon-addon-3dsmax
    base: str  # Path to base directory of addon or package
    software_source: str  # e.g. blender
    settings_filename: str

    config: ConfigParser.ConfigParser = None

    def __init__(self,
                 addon_name: str,
                 software_source: str,
                 base: str = os.path.join(os.path.expanduser("~"), "Poliigon"),
                 settings_filename: str = "settings.ini"):
        self.addon_name = addon_name
        self.base = os.path.join(base, software_source)
        self.settings_filename = settings_filename
        self.get_settings()

    def _ensure_sections_exist(self):
        sections = [
            "download",
            "library",
            "update",
            "logging",
            "purchase",
            "import",
            "onboarding",
            "ui",
            "user",
            "upgrade"
        ]
        for sec in sections:
            if not self.config.has_section(sec):
                self.config.add_section(sec)

    def _populate_default_settings(self):
        self._ensure_sections_exist()

        defaults = {
            "download": {
                "brush": "2K",
                "download_lods": "true",
                "hdri_bg": "8K",
                "hdri_light": "1K",
                "lod": "NONE",
                "model_res": "NONE",
                "tex_res": "2K"
            },
            "map_preferences": {},
            "library": {
                "primary": ""
            },
            "directories": {},
            "logging": {
                "reporting_opt_in": "true",
                "verbose_logs": "true"
            },
            "purchase": {
                "auto_download": "true"
            },
            "user": {
                "token": "",
                "first_local_asset": ""
            }
        }

        for section_name, section_defaults in defaults.items():
            if len(section_defaults.items()) == 0:
                self.config.add_section(section_name)
                continue
            for option, value in section_defaults.items():
                self.config.set(section_name, option, value)

def get_settings(self):
|
||||
# https://docs.python.org/3/library/configparser.html#configparser.ConfigParser.optionxform
|
||||
self.config = ConfigParser.ConfigParser()
|
||||
self.config.optionxform = str
|
||||
|
||||
self._populate_default_settings()
|
||||
|
||||
settings_file = os.path.join(self.base, self.settings_filename)
|
||||
if os.path.exists(settings_file):
|
||||
try:
|
||||
self.config.read(settings_file)
|
||||
except ValueError as e:
|
||||
print(f"Could not load settings for {self.addon_name}!")
|
||||
print(e)
|
||||
|
||||
def save_settings(self):
|
||||
if self.config is None:
|
||||
print(f"No settings found for {self.addon_name}! Initializing...")
|
||||
self.get_settings()
|
||||
|
||||
if not os.path.exists(self.base):
|
||||
try:
|
||||
os.makedirs(self.base)
|
||||
except Exception as e:
|
||||
print("Failed to create directory: ", e)
|
||||
raise
|
||||
|
||||
settings_file = os.path.join(self.base, self.settings_filename)
|
||||
with open(settings_file, "w+") as f:
|
||||
self.config.write(f)
|
||||
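The settings flow above boils down to stock `configparser` behavior: build the defaults in memory, set `optionxform = str` so option names keep their case, then overlay the user's `settings.ini` on top so file values win. A minimal, self-contained sketch of that round trip (the section and option names here are just examples):

```python
import configparser
import io

config = configparser.ConfigParser()
config.optionxform = str  # preserve option-name case, as get_settings() does

# In-memory defaults, written first.
config.add_section("download")
config.set("download", "tex_res", "2K")

# Reading a user file on top overrides defaults but keeps unknown ones.
user_ini = "[download]\ntex_res = 4K\n"
config.read_string(user_ini)

buf = io.StringIO()
config.write(buf)  # save_settings() does this to a real file

print(config.get("download", "tex_res"))  # 4K
```

Because `read()`/`read_string()` merges into the existing parser state, new defaults added in later addon versions survive even when the user's file predates them.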
@@ -0,0 +1,213 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

"""Module for thread management and thread queues for Poliigon software."""

from typing import Any, Dict, List, Optional, Union, Callable
from concurrent.futures import (CancelledError,
                                Future,
                                ThreadPoolExecutor)
from enum import Enum
import functools
import sys
import traceback


class PoolKeys(Enum):
    """Enum for the different ways to label a thread."""
    INTERACTIVE = 0  # Should be the default and highest preemptive order
    PREVIEW_DL = 1  # Preview thumbnails should be second order
    ASSET_DL = 2  # Asset downloads lowest, don't occupy the 'last thread'
    MP = 3  # Mixpanel signaling


def print_exc(fut: Future, key_pool: PoolKeys):
    """Default function to print exceptions from pool thread's done handler."""

    try:
        exc = fut.exception()
    except CancelledError:
        exc = None
    if exc is None:
        return
    print((f"=== ThreadManager[{key_pool.name}]: Thread Exception "
           f"({exc.__class__.__name__}): {exc}"))
    traceback.print_tb(exc.__traceback__)


class ThreadManager:
    """The class which manages the state of the threads.

    ThreadPools are created upon first use.

    The number of threads per pool can be set "globally" upon creation
    of the ThreadManager or per pool, when a pool is used the first time.

    Decorator to be implemented in a class using the ThreadManager.
    The parameters key_pool and foreground are explained in detail for
    queue_thread(). The code expects the ThreadManager instance in a member
    variable tm. Adapt as needed:

    def run_threaded(key_pool: PoolKeys,
                     max_threads: Optional[int] = None,
                     foreground: bool = False) -> callable:
        # Schedule a function to run in a thread of a chosen pool
        def wrapped_func(func: callable) -> callable:
            @functools.wraps(func)
            def wrapped_func_call(self, *args, **kwargs):
                args = (self, ) + args
                return self.tm.queue_thread(func, key_pool, max_threads,
                                            foreground, *args, **kwargs)
            return wrapped_func_call
        return wrapped_func
    """

    max_threads: int  # "global" max_threads, used if not overridden

    thread_pools: Dict[PoolKeys, ThreadPoolExecutor] = {}

    # Function from the reporting addon side to report Sentry messages from
    # threaded functions. Expected to receive as parameters the function name
    # and a partial of the function to be threaded.
    reporting_callable: Optional[Callable] = None

    def __init__(self,
                 max_threads: int = 10,
                 callback_print_exc: Optional[Callable] = None,
                 ):
        """Arguments:
        callback_print_exc: Callable to be used instead of the default
            print_exc function. The callable needs to have the following
            interface:
                print_exc(fut: Future, key_pool: PoolKeys)
            Partially wrap it if more parameters are needed.
        """

        self.thread_pools = {}
        self.max_threads = max_threads
        if callback_print_exc is None:
            self.print_exc = print_exc
        else:
            self.print_exc = callback_print_exc

    def get_pool(self,
                 key_pool: PoolKeys,
                 max_threads: Optional[int] = None,
                 no_create: bool = False
                 ) -> Optional[ThreadPoolExecutor]:
        """Returns the thread pool for a given key.

        If the pool does not exist yet, it will be created unless
        no_create is set to True, in which case None is returned.

        No need to call externally.
        """
        if key_pool in self.thread_pools:
            return self.thread_pools[key_pool]

        if no_create:
            return None

        if max_threads is None:
            max_threads = self.max_threads

        tpe = ThreadPoolExecutor(max_workers=max_threads)
        self.thread_pools[key_pool] = tpe
        return tpe

    def queue_thread(self,
                     func: callable,
                     key_pool: Optional[PoolKeys] = None,
                     max_threads: Optional[int] = None,
                     foreground: bool = False,
                     *args, **kwargs) -> Union[Future, Any]:
        """Enqueue a function for threaded execution via a thread pool.

        Parameters:
        key_pool: Selects the pool to be used, see the PoolKeys enum.
        max_threads: The maximum number of threads can only be set once upon
            the pool's first usage. It cannot be changed later on.
        foreground: Set to True to have the function executed directly
            instead of being submitted to a thread pool.

        Return value:
        Usually the Future belonging to a scheduled thread.
        If the foreground option is used, it may actually be anything,
        as the return value of the function is returned directly.
        """
        if max_threads is None or max_threads <= 0:
            max_threads = self.max_threads

        if key_pool is None:
            key_pool = PoolKeys.INTERACTIVE

        report_func = None
        if self.reporting_callable is not None:
            partial_func = functools.partial(func, *args, **kwargs)
            report_func = self.reporting_callable(func.__name__, partial_func)

        if foreground:
            # With the foreground option the function gets called directly.
            # NOTE: In this case the return value of the called function is
            # returned instead of a Future.
            if report_func is not None:
                fut = report_func()
            else:
                fut = func(*args, **kwargs)
        else:
            # Create the ThreadPoolExecutor, if not already in thread_pools dict.
            thread_pool = self.get_pool(key_pool, max_threads)

            # Schedule the function for threaded execution.
            if report_func is not None:
                fut = thread_pool.submit(report_func)
            else:
                fut = thread_pool.submit(func, *args, **kwargs)

            func_print = functools.partial(self.print_exc,
                                           key_pool=key_pool)
            fut.add_done_callback(func_print)

        return fut

    def shutdown(self,
                 key_pool: Optional[PoolKeys] = None,
                 wait: bool = True) -> None:
        """Shutdown one or all (key_pool=None) ThreadPoolExecutors."""
        if key_pool is None:
            for tpe in self.thread_pools.values():
                if sys.version_info >= (3, 8, 0):
                    tpe.shutdown(wait=wait, cancel_futures=True)
                else:
                    tpe.shutdown(wait=wait)
            self.thread_pools = {}
        elif key_pool in self.thread_pools:
            self.thread_pools[key_pool].shutdown(wait=wait)
            del self.thread_pools[key_pool]

    def pool_keys(self) -> List[PoolKeys]:
        """Returns a list containing the pool keys of current pools."""
        return list(self.thread_pools.keys())

    def number_of_pools(self) -> int:
        """Returns the number of currently active ThreadPoolExecutors.

        This does NOT mean these ThreadPoolExecutors are currently
        executing threads.
        """
        return len(self.thread_pools)
@@ -0,0 +1,491 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

"""Module for general purpose updating for Poliigon software."""
from typing import Dict, Optional, Sequence, Tuple, Callable, Any
from dataclasses import dataclass
import datetime
import json
import os
import threading
import requests
from .multilingual import _m

from .notifications import (Notification,
                            NotificationSystem,
                            NOTICE_TITLE_UPDATE)


BASE_URL = "https://software.poliigon.com"
TIMEOUT = 20.0

# Status texts
FAIL_GET_VERSIONS = _m("Failed to get versions")


def v2t(value: str) -> Optional[tuple]:
    """Take a version string like v1.2.3 and convert it to a tuple."""
    if not value or "." not in value:
        return None
    if value.lower().startswith("v"):
        value = value[1:]
    return tuple([int(ind) for ind in value.split(".")])


def t2v(ver: tuple) -> str:
    """Take a tuple like (2, 80) and construct a string like v2.80."""
    return "v" + ".".join(str(num) for num in ver)


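The reason the updater works on tuples at all is that Python compares tuples element-wise, so `>` and `<` give correct semantic-version ordering where string comparison would not ("1.10" sorts before "1.9" as a string). A standalone restatement of the two helpers showing this:

```python
def v2t(value: str):
    """'v1.2.3' -> (1, 2, 3); None when no dotted version is present."""
    if not value or "." not in value:
        return None
    if value.lower().startswith("v"):
        value = value[1:]
    return tuple(int(part) for part in value.split("."))


def t2v(ver: tuple) -> str:
    """(2, 80) -> 'v2.80'."""
    return "v" + ".".join(str(num) for num in ver)


# Element-wise tuple comparison orders versions correctly:
print(v2t("v1.10.0") > v2t("v1.9.2"))  # True
print("1.10.0" > "1.9.2")  # False, which is why strings are converted first
print(t2v((2, 80)))  # v2.80
```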
@dataclass
class AlertData:
    title: Optional[str] = None
    label: Optional[str] = None
    body: Optional[str] = None
    url: Optional[str] = None
    priority: Optional[int] = None
    action_string: Optional[str] = None
    open_popup: Optional[bool] = None
    allow_dismiss: bool = True
    auto_dismiss: bool = True

    valid: bool = True

    def validate_field(
            self,
            value: Any,
            field: str,
            field_type: type,
            mandatory: bool,
            report_callable: Optional[Callable] = None):
        exists = value is not None
        ok_exists = (exists or not mandatory)
        ok_type = True
        if exists:
            ok_type = isinstance(value, field_type)
        if not ok_exists or not ok_type:
            self.valid = False
            if report_callable is not None:
                report_callable(
                    "invalid_alert_information",
                    f"Invalid {field} {value}",
                    "error")

    def validate_data(self, report_callable: Optional[Callable] = None) -> None:
        rc = report_callable
        self.validate_field(self.title, "Title", str, True, rc)
        self.validate_field(self.label, "Label", str, True, rc)
        self.validate_field(self.priority, "Priority", int, True, rc)

        self.validate_field(self.url, "Url", str, False, rc)
        self.validate_field(self.action_string, "Action String", str, False, rc)

        self.validate_field(self.auto_dismiss, "Auto Dismiss", bool, False, rc)
        self.validate_field(self.allow_dismiss, "Allow Dismiss", bool, False, rc)

        if self.url is None:
            # One of url or body has to be available with the right format.
            self.validate_field(self.body, "Body", str, True, rc)

    def update_from_dict(
            self, data: Dict, report_callable: Optional[Callable] = None) -> None:
        self.title = data.get("title")
        self.label = data.get("label")
        self.body = data.get("body")
        self.url = data.get("url")
        self.action_string = data.get("action_string")
        self.priority = data.get("priority")

        self.allow_dismiss = data.get("allow_dismiss", True)
        self.auto_dismiss = data.get("auto_dismiss", True)

        if self.url is None or self.url == "":
            self.open_popup = True

        self.validate_data(report_callable)

    def create_notification(
            self, notification_system: NotificationSystem) -> Optional[Notification]:
        if not self.valid:
            return None
        notice = notification_system.create_version_alert(
            title=self.title,
            priority=self.priority,
            label=self.label,
            tooltip=self.label,
            body=self.body,
            action_string=self.action_string,
            url=self.url,
            open_popup=self.open_popup,
            allow_dismiss=self.allow_dismiss,
            auto_dismiss=self.auto_dismiss
        )

        return notice


@dataclass
class VersionData:
    """Container for a single version of the software."""
    version: Optional[tuple] = None
    url: Optional[str] = None
    min_software_version: Optional[tuple] = None  # Inclusive.
    max_software_version: Optional[tuple] = None  # Not inclusive.
    required: Optional[bool] = None
    release_timestamp: Optional[datetime.datetime] = None
    alert: Optional[AlertData] = None

    # Internal, human readable current status.
    status_title: str = ""
    status_details: str = ""
    status_ok: bool = True

    # Reporting rate for the version
    error_sample_rate: Optional[float] = None
    traces_sample_rate: Optional[float] = None

    def update_from_dict(
            self, data: Dict, report_callable: Optional[Callable] = None) -> None:
        self.version = v2t(data.get("version"))
        self.url = data.get("url", "")

        # List format like [2, 80]
        self.min_software_version = tuple(data.get("min_software_version"))
        self.max_software_version = tuple(data.get("max_software_version"))
        self.required = data.get("required")
        self.release_timestamp = data.get("release_timestamp")

        alert_data = data.get("alert", None)
        if alert_data is not None:
            self.alert = AlertData()
            self.alert.update_from_dict(alert_data, report_callable)

        self.error_sample_rate = data.get("error_sample_rate")
        self.traces_sample_rate = data.get("traces_sample_rate")

    def create_alert_notification(
            self, notification_system: NotificationSystem) -> Optional[Notification]:

        if self.alert is None or notification_system is None:
            return
        return self.alert.create_notification(notification_system)

    def create_update_notification(
            self, notification_system: NotificationSystem) -> Optional[Notification]:
        if self.url is None or notification_system is None:
            return

        version = str(self.version)
        version = version.replace(", ", ".")
        label = f"{NOTICE_TITLE_UPDATE} {version}"
        notice = notification_system.create_update(
            tooltip=NOTICE_TITLE_UPDATE,
            label=label,
            download_url=self.url
        )

        return notice


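`AlertData.update_from_dict` plus `validate_data` implement a common pattern: hydrate a dataclass from untrusted JSON, then mark the whole record invalid if any mandatory field is missing or mistyped, rather than raising. A self-contained sketch of that pattern (the `Alert` class and its fields here are illustrative, trimmed from the real `AlertData`):

```python
from dataclasses import dataclass
from typing import Any, Optional


@dataclass
class Alert:
    title: Optional[str] = None
    priority: Optional[int] = None
    valid: bool = True

    def validate_field(self, value: Any, field_type: type,
                       mandatory: bool) -> None:
        # Missing-but-mandatory or wrongly-typed values poison the record.
        exists = value is not None
        ok_exists = exists or not mandatory
        ok_type = isinstance(value, field_type) if exists else True
        if not ok_exists or not ok_type:
            self.valid = False

    def update_from_dict(self, data: dict) -> None:
        self.title = data.get("title")
        self.priority = data.get("priority")
        self.validate_field(self.title, str, True)
        self.validate_field(self.priority, int, True)


good = Alert()
good.update_from_dict({"title": "Update", "priority": 1})
bad = Alert()
bad.update_from_dict({"title": "Update", "priority": "high"})
print(good.valid, bad.valid)  # True False
```

Consumers then only need one check (`if not self.valid: return None`, as `create_notification` does) instead of wrapping every field access in error handling.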
class SoftwareUpdater():
    """Primary class which implements checks for updates and installs."""

    # Versions of software available.
    stable: Optional[VersionData]
    latest: Optional[VersionData]
    all_versions: Sequence

    # Always initialized
    addon_name: str  # e.g. poliigon-addon-blender.
    addon_version: tuple  # Current addon version.
    software_version: tuple  # DCC software version, e.g. (3, 0).
    base_url: str  # Primary url where updates and version data is hosted.

    # State properties.
    update_ready: Optional[bool] = None  # None until proven true or false.
    update_data: Optional[VersionData] = None
    _last_check: Optional[datetime.datetime] = None
    last_check_callback: Optional[Callable] = None  # When last_check changes.
    check_interval: Optional[int] = None  # Interval in seconds between auto checks.
    verbose: bool = True

    # Classes to be imported from the addon
    notification_system: Optional[NotificationSystem] = None
    reporting_callable: Optional[Callable] = None

    # Notifications
    alert_notice: Optional[Notification] = None
    update_notice: Optional[Notification] = None

    # Bool value to be set by the addon to take the update
    # data from the latest version instead of the stable one.
    update_from_latest: bool = False

    _check_thread: Optional[threading.Thread] = None

    def __init__(self,
                 addon_name: str,
                 addon_version: tuple,
                 software_version: tuple,
                 base_url: Optional[str] = None,
                 notification_system: Optional[NotificationSystem] = None,
                 local_json: Optional[str] = None):
        self.addon_name = addon_name
        self.addon_version = addon_version
        self.notification_system = notification_system
        self.software_version = software_version
        self.base_url = base_url if base_url is not None else BASE_URL
        self.local_json = local_json
        self.current_version = VersionData()

        self._clear_versions()

    @property
    def is_checking(self) -> bool:
        """Interface for other modules to see if a check for update is running."""
        return self._check_thread is not None and self._check_thread.is_alive()

    @property
    def last_check(self) -> str:
        if not self._last_check:
            return ""
        try:
            return self._last_check.strftime("%Y-%m-%d %H:%M")
        except ValueError as err:
            print("Get last update check error:", err)
            return ""

    @last_check.setter
    def last_check(self, value: str) -> None:
        try:
            self._last_check = datetime.datetime.strptime(
                value, "%Y-%m-%d %H:%M")
        except ValueError as err:
            print("Assign last update check error:", value, err)
            self._last_check = None
        if self.last_check_callback:
            self.last_check_callback(self.last_check)  # The string version.

    def _clear_versions(self) -> None:
        self.stable = None
        self.latest = None
        self.all_versions = []

    def _clear_update(self) -> None:
        self.update_ready = None  # Set to None until proven true or false.
        self.update_data = None
        self.status_ok = True

    def has_time_elapsed(self, hours: int = 24) -> bool:
        """Checks if a given number of hours have passed since the last check."""
        now = datetime.datetime.now()
        if not self._last_check:
            return True  # No check on record.
        diff = now - self._last_check
        return diff.total_seconds() / 3600.0 > hours

    def print_debug(self, *args):
        if self.verbose:
            print(*args)

    def update_versions(self) -> None:
        """Fetch the latest versions available from the server."""
        self.status_ok = True  # True until proven false.
        self._clear_versions()
        url = f"{self.base_url}/{self.addon_name}-versions.json"

        try:
            res = requests.get(url, timeout=TIMEOUT)
        except requests.exceptions.ConnectionError:
            self.status_title = FAIL_GET_VERSIONS
            self.status_ok = False
            self.status_details = "Updater ConnectionError"
            return
        except requests.exceptions.Timeout:
            self.status_title = FAIL_GET_VERSIONS
            self.status_ok = False
            self.status_details = "Updater Timeout"
            return
        except requests.exceptions.ProxyError:
            self.status_title = FAIL_GET_VERSIONS
            self.status_ok = False
            self.status_details = "Updater ProxyError"
            return

        if not res.ok:
            self.status_title = FAIL_GET_VERSIONS
            self.status_details = (
                "Did not get OK response while fetching available versions "
                f"from {url}")
            self.status_ok = False
            print(self.status_details)
            return
        if res.status_code != 200:
            self.status_title = FAIL_GET_VERSIONS
            self.status_details = (
                "Did not get OK code while fetching available versions")
            self.status_ok = False
            print(self.status_details)
            return

        try:
            resp = json.loads(res.text)
            if self.local_json is not None and os.path.isfile(self.local_json):
                with open(self.local_json) as f:
                    resp = json.load(f)
        except json.decoder.JSONDecodeError as e:
            self.status_title = FAIL_GET_VERSIONS
            self.status_details = "Could not parse json response for versions"
            self.status_ok = False
            self.status_is_error = True
            print(self.status_details)
            print(e)
            return

        if resp.get("stable"):
            self.stable = VersionData()
            self.stable.update_from_dict(resp["stable"])
        if resp.get("latest"):
            self.latest = VersionData()
            self.latest.update_from_dict(resp["latest"])
        if resp.get("versions"):
            for itm in resp["versions"]:
                ver = VersionData()
                ver.update_from_dict(itm, self.reporting_callable)
                self.all_versions.append(ver)
                if ver.version == self.addon_version:
                    self.current_version = ver

        self._last_check = datetime.datetime.now()
        self.last_check = self.last_check  # Trigger callback.

    def _update_notification_msg(self) -> None:
        if self.notification_system is None:
            return

        body = self.notification_system.addon_params.update_body
        if self.update_data is not None and body is not None:
            version = self.update_data.version
            version = ".".join(map(str, version))
            self.notification_system.addon_params.update_body = body.format(
                version)

    def _create_notifications(self) -> Tuple[Notification, Notification]:
        alert_notif = None
        update_notif = None
        self._update_notification_msg()
        if self.current_version.alert is not None:
            alert_notif = self.current_version.create_alert_notification(
                self.notification_system)
        if self.update_data is not None:
            update_notif = self.update_data.create_update_notification(
                self.notification_system)
        return update_notif, alert_notif

    def check_for_update(self,
                         callback: Optional[Callable] = None,
                         create_notifications: bool = False) -> bool:
        """Fetch and check versions to see if a new update is available."""
        self._clear_update()
        self.update_versions()

        if not self.status_ok:
            if callback:
                callback()
            return False

        # First compare against the stable release.
        if self.stable and self._check_eligible(self.stable):
            update_version = self.stable
            if self.update_from_latest and self.latest is not None:
                update_version = self.latest

            self.print_debug(
                "Using latest stable:",
                update_version.version,
                "vs current addon: ",
                self.addon_version)

            if update_version.version > self.addon_version:
                self.update_data = update_version
                self.update_ready = True
            else:
                self.update_ready = False
            if create_notifications and self.notification_system is not None:
                self.update_notice, self.alert_notice = self._create_notifications()
            if callback:
                callback()
            return True

        # Stable wasn't present or wasn't eligible; find the next best.
        self.print_debug("Unable to use current stable release")
        max_version = self.get_max_eligible()
        if max_version:
            if max_version.version > self.addon_version:
                self.update_data = max_version
                self.update_ready = True
            else:
                self.update_ready = False
        else:
            self.print_debug("No eligible releases found")
            self.update_ready = False

        if create_notifications and self.notification_system is not None:
            self.update_notice, self.alert_notice = self._create_notifications()
        if callback is not None:
            callback()
        return True

    def _check_eligible(self, version: VersionData) -> bool:
        """Verify if the input version is compatible with the current software."""
        eligible = True
        if version.min_software_version:
            if self.software_version < version.min_software_version:
                eligible = False
        if version.max_software_version:
            # Max is exclusive: if max is 3.0, software must be 2.99 or lower.
            if self.software_version >= version.max_software_version:
                eligible = False
        return eligible

    def get_max_eligible(self) -> Optional[VersionData]:
        """Find the eligible version with the highest version number."""
        max_eligible = None
        for ver in self.all_versions:
            if not self._check_eligible(ver):
                continue
            elif max_eligible is None:
                max_eligible = ver
            elif ver.version > max_eligible.version:
                max_eligible = ver
        return max_eligible

    def async_check_for_update(
            self, callback: Optional[Callable] = None,
            create_notifications: bool = False):
        """Start a background thread which will check for updates."""

        if self.is_checking:
            return

        self._check_thread = threading.Thread(
            target=self.check_for_update,
            args=(callback, create_notifications))

        self._check_thread.daemon = True
        self._check_thread.start()
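The eligibility and fallback logic in `_check_eligible` and `get_max_eligible` reduces to: a release is installable when the DCC version is at least `min_software_version` (inclusive) and below `max_software_version` (exclusive), and the update candidate is the highest such release. An illustrative, self-contained restatement (function and variable names here are sketches, not the class API):

```python
def check_eligible(software: tuple, min_ver, max_ver) -> bool:
    # min is inclusive, max is exclusive, either bound may be absent (None).
    if min_ver and software < min_ver:
        return False
    if max_ver and software >= max_ver:
        return False
    return True


# (addon version, min software version, max software version)
versions = [
    ((1, 2, 0), (2, 80), (3, 0)),
    ((1, 5, 0), (3, 0), None),
    ((1, 4, 0), (2, 90), (3, 3)),
]

software = (2, 93)
eligible = [ver for ver, lo, hi in versions if check_eligible(software, lo, hi)]
print(max(eligible))  # (1, 4, 0)
```

Here (1, 5, 0) is skipped because it needs software 3.0+, so the best eligible release is (1, 4, 0) even though a newer one exists, which is exactly the "find next best" fallback `check_for_update` performs when stable is not eligible.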
@@ -0,0 +1,498 @@
|
||||
# #### BEGIN GPL LICENSE BLOCK #####
|
||||
#
|
||||
# This program is free software; you can redistribute it and/or
|
||||
# modify it under the terms of the GNU General Public License
|
||||
# as published by the Free Software Foundation; either version 2
|
||||
# of the License, or (at your option) any later version.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this program; if not, write to the Free Software Foundation,
|
||||
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
|
||||
#
|
||||
# ##### END GPL LICENSE BLOCK #####
|
||||
|
||||
from dataclasses import dataclass
|
||||
from datetime import datetime
|
||||
|
||||
from typing import Any, Dict, Optional, Tuple
|
||||
|
||||
from .plan_manager import (PoliigonPlanUpgradeManager,
|
||||
PlanUpgradeStatus,
|
||||
PoliigonSubscription)
|
||||
from .multilingual import _t
|
||||
|
||||
|
||||
@dataclass
|
||||
class UpgradeIcons:
|
||||
check: str
|
||||
info: str
|
||||
unlimited: str
|
||||
|
||||
|
||||
class UpgradeContent:
|
||||
"""Class to be created in the DCC side which will define and store all
|
||||
UI content information."""
|
||||
|
||||
upgrade_manager: PoliigonPlanUpgradeManager
|
||||
current_plan: PoliigonSubscription
|
||||
|
||||
banner_primary_text: str = ""
|
||||
banner_secondary_text: str = ""
|
||||
banner_button_text: str = ""
|
||||
allow_dismiss: bool = False
|
||||
open_popup: bool = False
|
||||
icon_path: Optional[str] = None
|
||||
|
||||
# For upgrade popup
|
||||
upgrade_popup_title: Optional[str] = None
|
||||
upgrade_popup_table: Optional[Dict[str, str]] = None
|
||||
upgrade_popup_key_value: Optional[Dict[str, str]] = None
|
||||
upgrade_popup_text: Optional[str] = None
|
||||
upgrade_popup_confirm_button: Optional[str] = None
|
||||
upgrade_popup_pricing_button: Optional[str] = None
|
||||
upgrade_popup_terms_button: Optional[str] = None
|
||||
|
||||
# Upgrading process messages
|
||||
upgrading_primary_text: Optional[str] = None
|
||||
upgrading_secondary_text: Optional[str] = None
|
||||
|
||||
# For success popups
|
||||
success_popup_title: Optional[str] = None
|
||||
success_popup_text: Optional[str] = None
|
||||
|
||||
# For error popups
|
||||
error_popup_title: Optional[str] = None
|
||||
error_popup_text: Optional[str] = None
|
||||
|
||||
# Flags to be used for P4B UI
|
||||
as_single_paragraph: bool = False
|
||||
|
||||
# Flag to use in P4B & P4C UI
|
||||
use_single_policy_link: bool = True
|
||||
|
||||
# Stores the icon paths for each dcc
|
||||
icons: Optional[UpgradeIcons] = None
|
||||
|
||||
def __init__(self,
|
||||
upgrade_manager: PoliigonPlanUpgradeManager,
|
||||
as_single_paragraph: bool = False,
|
||||
use_single_policy_link: bool = True,
|
||||
icons: Optional[Tuple[str, str, str]] = None):
|
||||
"""Class to handle all the Content for Upgrade UI in each DCC.
|
||||
|
||||
Parameters:
|
||||
upgrade_manager: addon.upgrade_manager instance of PoliigonPlanUpgradeManager;
|
||||
as_single_paragraph: If True, the banner_secondary_text will be set as
|
||||
None, and all the text will be represented as
|
||||
a single paragraph in banner_primary_text;
|
||||
icons: The icon paths to be used in upgrade manager (order: check, info, unlimited)
|
||||
|
||||
NOTE: This class instance should be created in the addon side and
|
||||
stored in addon.upgrade_manager.content;
|
||||
"""
|
||||
|
||||
self.as_single_paragraph = as_single_paragraph
|
||||
self.use_single_policy_link = use_single_policy_link
|
||||
if icons is not None:
|
||||
# Icon paths order: check, info, unlimited
|
||||
self.icons = UpgradeIcons(icons[0], icons[1], icons[2])
|
||||
|
||||
self.refresh(upgrade_manager)
|
||||
|
||||
    def refresh(self,
                upgrade_manager: PoliigonPlanUpgradeManager,
                only_resume_popup: bool = False
                ) -> None:
        if upgrade_manager is not None:
            self.upgrade_manager = upgrade_manager
        if self.upgrade_manager is None:
            return
        if self.upgrade_manager.addon.user is None:
            return

        self.current_plan = self.upgrade_manager.addon.user.plan

        if only_resume_popup:
            # When resuming, the renewal date is updated, so in this
            # scenario we use this flag to only update the popup. If populate()
            # were called now, the status would be NO_UPGRADE_AVAILABLE and
            # the popup message would be a different one.
            self.set_resume_success_popup()
            return

        self.populate()

    def student_discount(self, is_teacher: bool = False) -> Any:
        primary = _t("Access the entire library by joining Pro")
        # Format after translating, so the translation lookup uses the
        # untranslated template string as its key.
        secondary = _t("{0} can claim a 50% discount").format(
            _t("Students") if not is_teacher else _t("Teachers"))
        if self.as_single_paragraph:
            self.banner_primary_text = f"{primary}. {secondary}"
            self.banner_secondary_text = None
        else:
            self.banner_primary_text = primary
            self.banner_secondary_text = secondary
        self.banner_button_text = _t("Choose Your Plan")
        self.allow_dismiss = False
        self.open_popup = False
        if self.icons is not None:
            self.icon_path = self.icons.check

    def become_pro(self) -> Any:
        primary = _t("Access the entire library by joining Pro")
        secondary = _t("Download and import from the entire Poliigon library")
        if self.as_single_paragraph:
            # To keep it slimmer, use just the primary text.
            self.banner_primary_text = primary
            self.banner_secondary_text = None
        else:
            self.banner_primary_text = primary
            self.banner_secondary_text = secondary
        self.banner_button_text = _t("Choose Your Plan")
        self.allow_dismiss = False
        self.open_popup = False
        if self.icons is not None:
            self.icon_path = self.icons.check

    def upgrade_balance(self) -> Any:
        primary_text, secondary_text = self._get_upgrade_text_upgrade_balance()

        self.banner_primary_text = primary_text
        self.banner_secondary_text = secondary_text
        self.banner_button_text = _t("Get More Downloads")
        self.allow_dismiss = False
        self.open_popup = True

        if self.icons is not None:
            self.icon_path = self.icons.info

    def resume_plan(self) -> Any:
        primary_text, secondary_text = self._get_upgrade_text_paused_until()

        self.banner_primary_text = primary_text
        self.banner_secondary_text = secondary_text
        self.banner_button_text = _t("Resume Plan")
        self.allow_dismiss = False
        self.open_popup = True

        if self.icons is not None:
            self.icon_path = self.icons.info

    def remove_cancel(self) -> Any:
        primary_text, secondary_text = self._get_upgrade_text_term_end()

        self.banner_primary_text = primary_text
        self.banner_secondary_text = secondary_text
        self.banner_button_text = _t("Resume Plan")
        self.allow_dismiss = False
        self.open_popup = True

        if self.icons is not None:
            self.icon_path = self.icons.info

    def remove_pause(self) -> Any:
        primary_text, secondary_text = self._get_upgrade_text_paused_at()

        self.banner_primary_text = primary_text
        self.banner_secondary_text = secondary_text
        self.banner_button_text = _t("Cancel Pause")
        self.allow_dismiss = False
        self.open_popup = True

        if self.icons is not None:
            self.icon_path = self.icons.info

    def upgrade_unlimited(self) -> Any:
        primary_text, secondary_text = self._get_upgrade_text_unlimited()

        self.banner_primary_text = primary_text
        self.banner_secondary_text = secondary_text
        self.banner_button_text = _t("Upgrade to Unlimited")
        self.allow_dismiss = True
        self.open_popup = True

        if self.icons is not None:
            self.icon_path = self.icons.unlimited
        return self

    def _get_upgrade_text_upgrade_balance(self) -> Tuple[str, Optional[str]]:
        """Returns text to display when the user has run out of downloads."""
        next_renewal_date = self.current_plan.next_subscription_renewal_date
        diff = next_renewal_date - datetime.now()

        head = _t("You’re out of downloads")
        text = _t("You’ll get more in {0} days or upgrade "
                  "to download now").format(diff.days)
        if self.as_single_paragraph:
            return f"{head}. {text}", None
        else:
            return head, text

    def _get_upgrade_text_paused_at(self) -> Tuple[str, Optional[str]]:
        """Returns text to display in case of a scheduled pause subscription."""

        pause_date = self.current_plan.plan_paused_at
        date_paused_until = None
        if pause_date is not None:
            date_paused_until = pause_date.strftime("%d %b %Y")

        head = _t("Your plan will pause on {0}").format(date_paused_until)
        text = _t("Cancel pause to keep downloading")
        if self.as_single_paragraph:
            return f"{head}. {text}", None
        else:
            return head, text

    def _get_upgrade_text_paused_until(self) -> Tuple[str, Optional[str]]:
        """Returns text to display in case of a paused subscription."""

        date_paused_until = self.current_plan.plan_paused_until.strftime("%d %b %Y")
        head = _t("Your plan is paused until {0}").format(date_paused_until)
        text = _t("Resume your plan to download new assets")
        if self.as_single_paragraph:
            return f"{head}. {text}", None
        else:
            return head, text

    def _get_upgrade_text_term_end(self) -> Tuple[str, Optional[str]]:
        """Returns text to display in case of a cancelled subscription."""

        date_term_end = self.current_plan.current_term_end.strftime("%d %b %Y")
        head = _t("Your plan will end on {0}").format(date_term_end)
        text = _t("Resume your plan to keep downloading")
        if self.as_single_paragraph:
            return f"{head}. {text}", None
        else:
            return head, text

    def _get_upgrade_text_unlimited(self) -> Tuple[str, Optional[str]]:
        """Returns text to display in case of a non-unlimited subscription."""

        head = _t("Need more downloads?")
        text = _t("Upgrade to unlimited and never worry about limits again")
        if self.as_single_paragraph:
            # Per Blender design, don't include the second bit of text
            return head, None
        else:
            return head, text

    def _get_text_price_change(self) -> str:
        price_old = self.current_plan.base_price
        price_new = self.upgrade_manager.upgrade_plan.base_price

        currency_code = self.upgrade_manager.upgrade_info.currency_code
        currency_symbol = self.upgrade_manager.upgrade_info.currency_symbol

        if price_old is not None:
            price_old = f"{currency_symbol}{price_old:.2f} {currency_code}"
        if price_new is not None:
            price_new = f"{currency_symbol}{price_new:.2f} {currency_code}"

        return f"{price_old} \u2192 {price_new}"

    def _get_text_licence(self) -> str:
        """Decodes boolean has_team into 'Team' or 'Individual'."""

        has_team = self.upgrade_manager.upgrade_plan.has_team
        if has_team:
            text_licence = _t("Team")
        else:
            text_licence = _t("Individual")
        return text_licence

    def _get_text_billing_period(self) -> str:
        """Decodes period_unit into 'Yearly' or 'Monthly'."""

        period_unit = self.upgrade_manager.upgrade_plan.period_unit
        if period_unit == "year":
            text_billing = _t("Yearly")
        elif period_unit == "month":
            text_billing = _t("Monthly")
        else:
            text_billing = period_unit
        return text_billing

    def _get_text_assets_change(self) -> str:
        """Returns the change in assets count as string ('previous -> new')."""

        new_assets = self.upgrade_manager.upgrade_info.new_assets
        prev_assets = self.upgrade_manager.upgrade_info.previous_assets
        text_assets = f"{prev_assets} \u2192 {new_assets}"
        return text_assets

    def _get_text_users_change(self) -> str:
        """Returns the change in user count as string ('previous -> new')."""

        new_users = self.upgrade_manager.upgrade_info.new_users
        previous_users = self.upgrade_manager.upgrade_info.previous_users
        text_users = f"{previous_users} \u2192 {new_users}"
        return text_users

    def _get_text_amount_due(self) -> str:
        """Returns amount due as string with currency code and symbol."""

        amount_due = self.upgrade_manager.upgrade_info.amount_due
        currency_code = self.upgrade_manager.upgrade_info.currency_code
        currency_symbol = self.upgrade_manager.upgrade_info.currency_symbol
        text_amount_due = f"{currency_symbol}{amount_due} {currency_code}"
        return text_amount_due

    def _get_text_amount_due_renewal(self) -> str:
        """Returns amount due on renewal as string with
        currency code and symbol.
        """

        amount_due_renewal = self.upgrade_manager.upgrade_info.amount_due_renewal
        currency_code = self.upgrade_manager.upgrade_info.currency_code
        currency_symbol = self.upgrade_manager.upgrade_info.currency_symbol
        text_amount_due_renewal = (f"{currency_symbol}{amount_due_renewal} "
                                   f"{currency_code}")
        return text_amount_due_renewal

    def set_resume_popup_information(self):
        # TODO: Check string phrasing here (maybe different texts for each
        # one of the scenarios of resuming)
        self.upgrade_popup_text = _t("Would you like to resume your plan? "
                                     "You will be charged for renewal and can "
                                     "start downloading straight away.")
        self.upgrade_popup_title = _t("Resume Plan")
        self.upgrade_popup_confirm_button = _t("Resume Plan")

        self.upgrading_primary_text = _t("Resuming plan...")
        self.upgrading_secondary_text = _t("This may take a few seconds.")
        self.upgrade_popup_table = None
        self.upgrade_popup_key_value = None
        self.upgrade_popup_pricing_button = None
        self.upgrade_popup_terms_button = None

    def set_remove_scheduled_pause_popup_information(self):
        # TODO: Check string phrasing here
        self.upgrade_popup_text = _t("Would you like to remove the scheduled pause?")
        self.upgrade_popup_title = _t("Cancel Pause")
        self.upgrade_popup_confirm_button = _t("Cancel Pause")

        self.upgrading_primary_text = _t("Cancelling Pause...")
        self.upgrading_secondary_text = _t("This may take a few seconds.")
        self.upgrade_popup_table = None
        self.upgrade_popup_key_value = None
        self.upgrade_popup_pricing_button = None
        self.upgrade_popup_terms_button = None

    def set_remove_scheduled_cancel_popup_information(self):
        # TODO: Check string phrasing here
        self.upgrade_popup_text = _t("Would you like to remove the scheduled cancellation?")
        self.upgrade_popup_title = _t("Remove Cancellation")
        self.upgrade_popup_confirm_button = _t("Remove Cancellation")

        self.upgrading_primary_text = _t("Removing Cancellation...")
        self.upgrading_secondary_text = _t("This may take a few seconds.")
        self.upgrade_popup_table = None
        self.upgrade_popup_key_value = None
        self.upgrade_popup_pricing_button = None
        self.upgrade_popup_terms_button = None

    def set_upgrade_popup_information(self):
        self.upgrade_popup_table = {
            _t("Assets per month:"): self._get_text_assets_change(),

            # The following lines are commented out due to a decision not to
            # show team-related information on the confirmation popup;
            # _t("Users:"): self._get_text_users_change(),
            # _t("License:"): self._get_text_licence(),

            _t("{0} price:").format(self._get_text_billing_period()): self._get_text_price_change(),
            _t("Starts:"): _t("Today"),
            _t("Billing frequency:"): self._get_text_billing_period(),
            _t("Renewal:"): self.upgrade_manager.upgrade_info.renewal_date
        }

        self.upgrade_popup_key_value = {
            _t("Due today:"): self._get_text_amount_due(),
            _t("Due on renewal:"): self._get_text_amount_due_renewal()
        }

        if self.use_single_policy_link:
            confirm_text = _t("By confirming you agree to our Unlimited Fair Use "
                              "Policy, Terms & Conditions, Privacy & Refund Policy below.")
        else:
            confirm_text = _t("By confirming you agree to our Unlimited Fair Use "
                              "Policy, Terms & Conditions, Privacy & Refund Policy.")

        tax_text = ""
        if self.upgrade_manager.upgrade_info.tax_rate not in [None, 0]:
            tax_text = _t("Due today and renewal prices include {0}% tax. ").format(
                self.upgrade_manager.upgrade_info.tax_rate)
        self.upgrade_popup_text = f"{tax_text}{confirm_text}"

        self.upgrade_popup_title = _t("Change Plan")
        self.upgrade_popup_confirm_button = _t("Confirm Plan Change")
        self.upgrade_popup_pricing_button = _t("View All Pricing")
        self.upgrade_popup_terms_button = _t("Terms & Policy Documents")

    def set_upgrade_success_popup(self):
        self.success_popup_title = _t("Plan Change Successful")
        self.success_popup_text = _t("You have successfully updated your plan.")

        # TODO(Joao): Implement different hard coded error messages for each
        # scenario - look for strings in the api error (maybe error codes)

        self.error_popup_title = _t("Error Upgrading Plan")
        self.error_popup_text = _t("Upgrade Plan Failed. \n\n{0}\n\n"
                                   "Try again later or reach out to support.")

        self.upgrading_primary_text = _t("Upgrading Plan...")
        self.upgrading_secondary_text = _t("This may take a few seconds.")

    def set_resume_success_popup(self):
        # TODO: Check string phrasing here (just a mock str for now)
        renewal_date = self.current_plan.next_subscription_renewal_date
        text = _t("Your plan has successfully resumed")
        if renewal_date is not None:
            renewal_text = renewal_date.strftime("%d %b %Y")
            renewal_date_text = _t("and will renew on {0}").format(renewal_text)
            text = f"{text} {renewal_date_text}."
        else:
            text = f"{text}."

        self.success_popup_title = _t("Plan Resumed")
        self.success_popup_text = text

        self.error_popup_title = _t("Error Resuming Plan")
        self.error_popup_text = _t("Resume Plan Failed. \n\n{0}\n\n"
                                   "Try again later or reach out to support.")

    def populate(self) -> None:
        upgrade_status = self.upgrade_manager.status
        if upgrade_status == PlanUpgradeStatus.STUDENT_DISCOUNT:
            self.student_discount()
        elif upgrade_status == PlanUpgradeStatus.TEACHER_DISCOUNT:
            self.student_discount(is_teacher=True)
        elif upgrade_status == PlanUpgradeStatus.BECOME_PRO:
            self.become_pro()
        elif upgrade_status == PlanUpgradeStatus.UPGRADE_PLAN_BALANCE:
            self.upgrade_balance()
            if self.upgrade_manager.upgrade_info is None:
                return
            self.set_upgrade_popup_information()
            self.set_upgrade_success_popup()
        elif upgrade_status == PlanUpgradeStatus.RESUME_PLAN:
            self.resume_plan()
            self.set_resume_popup_information()
            self.set_resume_success_popup()
        elif upgrade_status == PlanUpgradeStatus.REMOVE_SCHEDULED_PAUSE:
            self.remove_pause()
            self.set_remove_scheduled_pause_popup_information()
            self.set_resume_success_popup()
        elif upgrade_status == PlanUpgradeStatus.REMOVE_CANCELLATION:
            self.remove_cancel()
            self.set_remove_scheduled_cancel_popup_information()
            self.set_resume_success_popup()
        elif upgrade_status == PlanUpgradeStatus.UPGRADE_PLAN_UNLIMITED:
            self.upgrade_unlimited()
            if self.upgrade_manager.upgrade_info is None:
                return
            self.set_upgrade_popup_information()
            self.set_upgrade_success_popup()
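The status dispatch above can also be expressed as a lookup table, which is a common way to keep such chains short. A minimal, self-contained sketch with a stand-in enum and content class (the real `PlanUpgradeStatus` and handler methods live in the addon; the names here mirror `populate()` but everything else is illustrative):

```python
from enum import Enum, auto


class PlanUpgradeStatus(Enum):
    # Stand-in for the addon's real status enum (names taken from populate()).
    STUDENT_DISCOUNT = auto()
    BECOME_PRO = auto()
    RESUME_PLAN = auto()


class Content:
    def __init__(self):
        self.calls = []

    def student_discount(self):
        self.calls.append("student_discount")

    def become_pro(self):
        self.calls.append("become_pro")

    def resume_plan(self):
        self.calls.append("resume_plan")

    def populate(self, status):
        # Table-driven equivalent of the if/elif chain in populate().
        handlers = {
            PlanUpgradeStatus.STUDENT_DISCOUNT: self.student_discount,
            PlanUpgradeStatus.BECOME_PRO: self.become_pro,
            PlanUpgradeStatus.RESUME_PLAN: self.resume_plan,
        }
        handler = handlers.get(status)
        if handler is not None:
            handler()


content = Content()
content.populate(PlanUpgradeStatus.BECOME_PRO)
```

The table keeps each status mapped to exactly one handler, and unknown statuses fall through harmlessly, matching the original chain's behavior of doing nothing for `NO_UPGRADE_AVAILABLE`-style values.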
@@ -0,0 +1,129 @@
# #### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# ##### END GPL LICENSE BLOCK #####

from dataclasses import dataclass
from typing import Dict, List, Optional

from .assets import MapType
from .plan_manager import PoliigonSubscription

from .logger import (DEBUG,  # noqa F401, allowing downstream const usage
                     ERROR,
                     INFO,
                     get_addon_logger,
                     NOT_SET,
                     WARNING)


@dataclass
class MapFormats:
    map_type: MapType
    default: str
    required: bool
    extensions: Dict[str, bool]
    enabled: bool
    selected: Optional[str] = None


class UserDownloadPreferences:
    resolution_options: List[str]
    default_resolution: str
    selected_resolution: Optional[str] = None

    texture_maps: List[MapFormats]

    software_selected: Optional[str] = None
    render_engine_selected: Optional[str] = None

    lod_options: List[str]
    lod_selected: Optional[str] = None

    def __init__(self, res: Dict):
        self.res = res

        self.set_resolution()
        self.set_lods()
        self.set_software()
        self.set_texture_maps()

    def set_resolution(self) -> None:
        resolution_info = self.res.get("default_resolution", {})
        self.resolution_options = resolution_info.get("resolution_options")
        self.default_resolution = resolution_info.get("default")
        self.selected_resolution = resolution_info.get("selected")

    def set_lods(self) -> None:
        lods_info = self.res.get("lods", {})
        self.lod_options = lods_info.get("lod_options")
        self.lod_selected = lods_info.get("selected")

    def set_software(self) -> None:
        software_info = self.res.get("softwares", {})
        for _soft, soft_inf in software_info.items():
            soft_selected = soft_inf.get("selected", None)
            renderer_selected = soft_inf.get("selected_render_engine", None)
            if soft_selected is not None and renderer_selected is not None:
                self.software_selected = soft_selected
                self.render_engine_selected = renderer_selected
                break
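The selection loop in `set_software()` picks the first software entry that has both a version and a render engine selected. A standalone sketch of that logic, with a hypothetical payload (the keys mirror what `set_software()` reads, but the concrete software names and values are illustrative assumptions):

```python
# Hypothetical payload shaped like the "softwares" section consumed by
# set_software(); the software names and values are assumptions.
software_info = {
    "blender": {"selected": None, "selected_render_engine": None},
    "max": {"selected": "2024", "selected_render_engine": "vray"},
}

software_selected = None
render_engine_selected = None
for _soft, soft_inf in software_info.items():
    soft_selected = soft_inf.get("selected")
    renderer_selected = soft_inf.get("selected_render_engine")
    # Pick the first entry that has both a software and an engine selection;
    # entries with partial or no selections are skipped.
    if soft_selected is not None and renderer_selected is not None:
        software_selected = soft_selected
        render_engine_selected = renderer_selected
        break
```

Because dicts preserve insertion order, "first" here means first in the payload; entries missing either selection are passed over.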
    def set_texture_maps(self) -> None:
        self.texture_maps = []
        texture_maps_info = self.res.get("texture_maps", {})
        for _map, map_info in texture_maps_info.items():
            map_type = MapType.from_type_code(_map)
            enabled = map_info.get("selected") is not None
            map_format = MapFormats(map_type=map_type,
                                    default=map_info.get("default"),
                                    enabled=enabled,
                                    selected=map_info.get("selected"),
                                    required=map_info.get("required"),
                                    extensions=map_info.get("formats"))

            self.texture_maps.append(map_format)

    def string_stamp(self) -> str:
        string_stamp = ""
        for _map in self.texture_maps:
            string_stamp += f"{_map.map_type.name}:{str(_map.selected)};"
        return string_stamp

    def get_map_preferences(self, map_type: MapType) -> Optional[MapFormats]:
        for _map in self.texture_maps:
            if _map.map_type.get_effective() == map_type.get_effective():
                return _map
        return None

    def get_all_maps_enabled(self):
        return [_map for _map in self.texture_maps if _map.enabled]


@dataclass
class PoliigonUser:
    """Container object for a user."""

    user_name: str
    user_id: int
    is_student: Optional[bool] = False
    is_teacher: Optional[bool] = False
    credits: Optional[int] = None
    credits_od: Optional[int] = None
    plan: Optional[PoliigonSubscription] = None
    map_preferences: Optional[UserDownloadPreferences] = None
    # TODO(Joao): remove this flag when all addons are using map prefs
    use_preferences_on_download: bool = False
@@ -0,0 +1,57 @@
from sentry_sdk.scope import Scope
from sentry_sdk.transport import Transport, HttpTransport
from sentry_sdk.client import Client

from sentry_sdk.api import *  # noqa

from sentry_sdk.consts import VERSION  # noqa

__all__ = [  # noqa
    "Hub",
    "Scope",
    "Client",
    "Transport",
    "HttpTransport",
    "integrations",
    # From sentry_sdk.api
    "init",
    "add_breadcrumb",
    "capture_event",
    "capture_exception",
    "capture_message",
    "configure_scope",
    "continue_trace",
    "flush",
    "get_baggage",
    "get_client",
    "get_global_scope",
    "get_isolation_scope",
    "get_current_scope",
    "get_current_span",
    "get_traceparent",
    "is_initialized",
    "isolation_scope",
    "last_event_id",
    "new_scope",
    "push_scope",
    "set_context",
    "set_extra",
    "set_level",
    "set_measurement",
    "set_tag",
    "set_tags",
    "set_user",
    "start_span",
    "start_transaction",
    "trace",
    "monitor",
]

# Initialize the debug support after everything is loaded
from sentry_sdk.debug import init_debug_support

init_debug_support()
del init_debug_support

# circular imports
from sentry_sdk.hub import Hub
@@ -0,0 +1,98 @@
import sys

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from typing import Any
    from typing import TypeVar

    T = TypeVar("T")


PY37 = sys.version_info[0] == 3 and sys.version_info[1] >= 7
PY38 = sys.version_info[0] == 3 and sys.version_info[1] >= 8
PY310 = sys.version_info[0] == 3 and sys.version_info[1] >= 10
PY311 = sys.version_info[0] == 3 and sys.version_info[1] >= 11
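The flags above compare the major and minor components separately. On any CPython 3.x interpreter this is equivalent to a tuple comparison against a prefix of `sys.version_info`, which some codebases prefer for brevity; a quick sketch of the equivalence:

```python
import sys

# Tuple-comparison formulation; on any CPython 3.x interpreter this matches
# the PY38-style flag defined above (the two only diverge on a hypothetical
# major version other than 3).
PY38_TUPLE = sys.version_info[:2] >= (3, 8)
PY38_FLAG = sys.version_info[0] == 3 and sys.version_info[1] >= 8
```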


def with_metaclass(meta, *bases):
    # type: (Any, *Any) -> Any
    class MetaClass(type):
        def __new__(metacls, name, this_bases, d):
            # type: (Any, Any, Any, Any) -> Any
            return meta(name, bases, d)

    return type.__new__(MetaClass, "temporary_class", (), {})
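The helper returns a throwaway class whose metaclass intercepts class creation, so a class that inherits from it ends up created by `meta` directly. A self-contained sketch (the `Meta` metaclass and `Widget` class are hypothetical; `with_metaclass` is copied verbatim from above):

```python
class Meta(type):
    # Hypothetical metaclass that tags every class it creates.
    def __new__(metacls, name, bases, d):
        cls = super().__new__(metacls, name, bases, d)
        cls.tagged = True
        return cls


def with_metaclass(meta, *bases):
    # Copied from the helper above: the temporary class's metaclass forwards
    # class creation to `meta`, discarding the temporary class itself.
    class MetaClass(type):
        def __new__(metacls, name, this_bases, d):
            return meta(name, bases, d)

    return type.__new__(MetaClass, "temporary_class", (), {})


class Widget(with_metaclass(Meta)):
    pass
```

`Widget` is created by `Meta`, not by plain `type`, and the `temporary_class` base disappears from the final class.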


def check_uwsgi_thread_support():
    # type: () -> bool
    # We check two things here:
    #
    # 1. uWSGI doesn't run in threaded mode by default -- issue a warning if
    #    that's the case.
    #
    # 2. Additionally, if uWSGI is running in preforking mode (default), it needs
    #    the --py-call-uwsgi-fork-hooks option for the SDK to work properly. This
    #    is because any background threads spawned before the main process is
    #    forked are NOT CLEANED UP IN THE CHILDREN BY DEFAULT even if
    #    --enable-threads is on. One has to explicitly provide
    #    --py-call-uwsgi-fork-hooks to force uWSGI to run regular cpython
    #    after-fork hooks that take care of cleaning up stale thread data.
    try:
        from uwsgi import opt  # type: ignore
    except ImportError:
        return True

    from sentry_sdk.consts import FALSE_VALUES

    def enabled(option):
        # type: (str) -> bool
        value = opt.get(option, False)
        if isinstance(value, bool):
            return value

        if isinstance(value, bytes):
            try:
                value = value.decode()
            except Exception:
                pass

        return value and str(value).lower() not in FALSE_VALUES

    # When `threads` is passed in as a uwsgi option,
    # `enable-threads` is implied on.
    threads_enabled = "threads" in opt or enabled("enable-threads")
    fork_hooks_on = enabled("py-call-uwsgi-fork-hooks")
    lazy_mode = enabled("lazy-apps") or enabled("lazy")

    if lazy_mode and not threads_enabled:
        from warnings import warn

        warn(
            Warning(
                "IMPORTANT: "
                "We detected the use of uWSGI without thread support. "
                "This might lead to unexpected issues. "
                'Please run uWSGI with "--enable-threads" for full support.'
            )
        )

        return False

    elif not lazy_mode and (not threads_enabled or not fork_hooks_on):
        from warnings import warn

        warn(
            Warning(
                "IMPORTANT: "
                "We detected the use of uWSGI in preforking mode without "
                "thread support. This might lead to crashing workers. "
                'Please run uWSGI with both "--enable-threads" and '
                '"--py-call-uwsgi-fork-hooks" for full support.'
            )
        )

        return False

    return True
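The nested `enabled()` helper normalizes whatever uWSGI stores for an option: bools pass through, bytes get decoded, and "false"-like spellings are treated as off. A standalone sketch of that normalization (the `FALSE_VALUES` set here is an assumed stand-in; the real set lives in `sentry_sdk.consts`, and the result is coerced to a plain bool for clarity):

```python
# Assumed stand-in for sentry_sdk.consts.FALSE_VALUES; the real contents
# may differ.
FALSE_VALUES = {"false", "no", "off", "n", "0"}


def option_enabled(opt, option):
    # Mirrors the nested enabled() helper above: bools pass through, bytes
    # are decoded best-effort, and any "false" spelling is treated as off.
    value = opt.get(option, False)
    if isinstance(value, bool):
        return value

    if isinstance(value, bytes):
        try:
            value = value.decode()
        except Exception:
            pass

    return bool(value) and str(value).lower() not in FALSE_VALUES
```

So `b"true"` and `"1"` count as enabled, while a missing option, `False`, or `"0"` do not.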
@@ -0,0 +1,84 @@
import warnings

from typing import TYPE_CHECKING

import sentry_sdk

if TYPE_CHECKING:
    from typing import Any, ContextManager, Optional

    import sentry_sdk.consts


class _InitGuard:
    _CONTEXT_MANAGER_DEPRECATION_WARNING_MESSAGE = (
        "Using the return value of sentry_sdk.init as a context manager "
        "and manually calling the __enter__ and __exit__ methods on the "
        "return value are deprecated. We are no longer maintaining this "
        "functionality, and we will remove it in the next major release."
    )

    def __init__(self, client):
        # type: (sentry_sdk.Client) -> None
        self._client = client

    def __enter__(self):
        # type: () -> _InitGuard
        warnings.warn(
            self._CONTEXT_MANAGER_DEPRECATION_WARNING_MESSAGE,
            stacklevel=2,
            category=DeprecationWarning,
        )

        return self

    def __exit__(self, exc_type, exc_value, tb):
        # type: (Any, Any, Any) -> None
        warnings.warn(
            self._CONTEXT_MANAGER_DEPRECATION_WARNING_MESSAGE,
            stacklevel=2,
            category=DeprecationWarning,
        )

        c = self._client
        if c is not None:
            c.close()


def _check_python_deprecations():
    # type: () -> None
    # Since we're likely to deprecate Python versions in the future, I'm keeping
    # this handy function around. Use this to detect the Python version used and
    # to output logger.warning()s if it's deprecated.
    pass


def _init(*args, **kwargs):
    # type: (*Optional[str], **Any) -> ContextManager[Any]
    """Initializes the SDK and optionally integrations.

    This takes the same arguments as the client constructor.
    """
    client = sentry_sdk.Client(*args, **kwargs)
    sentry_sdk.get_global_scope().set_client(client)
    _check_python_deprecations()
    rv = _InitGuard(client)
    return rv


if TYPE_CHECKING:
    # Make mypy, PyCharm and other static analyzers think `init` is a type to
    # have nicer autocompletion for params.
    #
    # Use `ClientConstructor` to define the argument types of `init` and
    # `ContextManager[Any]` to tell static analyzers about the return type.

    class init(sentry_sdk.consts.ClientConstructor, _InitGuard):  # noqa: N801
        pass

else:
    # Alias `init` for actual usage. Go through the lambda indirection to throw
    # PyCharm off of the weakly typed signature (it would otherwise discover
    # both the weakly typed signature of `_init` and our faked `init` type).

    init = (lambda: _init)()
@@ -0,0 +1,47 @@
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from typing import Any


_SENTINEL = object()


class LRUCache:
    def __init__(self, max_size):
        # type: (int) -> None
        if max_size <= 0:
            raise AssertionError(f"invalid max_size: {max_size}")
        self.max_size = max_size
        self._data = {}  # type: dict[Any, Any]
        self.hits = self.misses = 0
        self.full = False

    def set(self, key, value):
        # type: (Any, Any) -> None
        current = self._data.pop(key, _SENTINEL)
        if current is not _SENTINEL:
            self._data[key] = value
        elif self.full:
            self._data.pop(next(iter(self._data)))
            self._data[key] = value
        else:
            self._data[key] = value
            self.full = len(self._data) >= self.max_size

    def get(self, key, default=None):
        # type: (Any, Any) -> Any
        try:
            ret = self._data.pop(key)
        except KeyError:
            self.misses += 1
            ret = default
        else:
            self.hits += 1
            self._data[key] = ret

        return ret

    def get_all(self):
        # type: () -> list[tuple[Any, Any]]
        return list(self._data.items())
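The cache relies on dict insertion order: `get` and `set` pop and reinsert a touched key so it moves to the end, leaving the first key as the least recently used. A self-contained walkthrough of the eviction behavior (the class body is copied from above so the example runs on its own):

```python
_SENTINEL = object()


class LRUCache:
    # Copied from the class above so the example is self-contained.
    def __init__(self, max_size):
        if max_size <= 0:
            raise AssertionError(f"invalid max_size: {max_size}")
        self.max_size = max_size
        self._data = {}
        self.hits = self.misses = 0
        self.full = False

    def set(self, key, value):
        current = self._data.pop(key, _SENTINEL)
        if current is not _SENTINEL:
            self._data[key] = value
        elif self.full:
            # Dicts preserve insertion order, so the first key is the
            # least recently used entry.
            self._data.pop(next(iter(self._data)))
            self._data[key] = value
        else:
            self._data[key] = value
            self.full = len(self._data) >= self.max_size

    def get(self, key, default=None):
        try:
            ret = self._data.pop(key)
        except KeyError:
            self.misses += 1
            ret = default
        else:
            self.hits += 1
            # Reinsert so this key becomes the most recently used.
            self._data[key] = ret
        return ret


cache = LRUCache(max_size=2)
cache.set("a", 1)
cache.set("b", 2)
cache.get("a")     # touch "a" so "b" becomes the least recently used
cache.set("c", 3)  # at capacity: evicts "b", not "a"
```

After this sequence, "a" and "c" are still cached while "b" has been evicted, showing that the recent `get("a")` protected it.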
@@ -0,0 +1,289 @@
|
||||
"""
|
||||
A fork of Python 3.6's stdlib queue (found in Pythons 'cpython/Lib/queue.py')
|
||||
with Lock swapped out for RLock to avoid a deadlock while garbage collecting.
|
||||
|
||||
https://github.com/python/cpython/blob/v3.6.12/Lib/queue.py
|
||||
|
||||
|
||||
See also
|
||||
https://codewithoutrules.com/2017/08/16/concurrency-python/
|
||||
https://bugs.python.org/issue14976
|
||||
https://github.com/sqlalchemy/sqlalchemy/blob/4eb747b61f0c1b1c25bdee3856d7195d10a0c227/lib/sqlalchemy/queue.py#L1

We also vendor the code to evade eventlet's broken monkeypatching, see
https://github.com/getsentry/sentry-python/pull/484


Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010,
2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020 Python Software Foundation;

All Rights Reserved


PYTHON SOFTWARE FOUNDATION LICENSE VERSION 2
--------------------------------------------

1. This LICENSE AGREEMENT is between the Python Software Foundation
("PSF"), and the Individual or Organization ("Licensee") accessing and
otherwise using this software ("Python") in source or binary form and
its associated documentation.

2. Subject to the terms and conditions of this License Agreement, PSF hereby
grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce,
analyze, test, perform and/or display publicly, prepare derivative works,
distribute, and otherwise use Python alone or in any derivative version,
provided, however, that PSF's License Agreement and PSF's notice of copyright,
i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010,
2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020 Python Software Foundation;
All Rights Reserved" are retained in Python alone or in any derivative version
prepared by Licensee.

3. In the event Licensee prepares a derivative work that is based on
or incorporates Python or any part thereof, and wants to make
the derivative work available to others as provided herein, then
Licensee hereby agrees to include in any such work a brief summary of
the changes made to Python.

4. PSF is making Python available to Licensee on an "AS IS"
basis. PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON WILL NOT
INFRINGE ANY THIRD PARTY RIGHTS.

5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON,
OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.

6. This License Agreement will automatically terminate upon a material
breach of its terms and conditions.

7. Nothing in this License Agreement shall be deemed to create any
relationship of agency, partnership, or joint venture between PSF and
Licensee. This License Agreement does not grant permission to use PSF
trademarks or trade name in a trademark sense to endorse or promote
products or services of Licensee, or any third party.

8. By copying, installing or otherwise using Python, Licensee
agrees to be bound by the terms and conditions of this License
Agreement.

"""

import threading

from collections import deque
from time import time

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from typing import Any

__all__ = ["EmptyError", "FullError", "Queue"]


class EmptyError(Exception):
    "Exception raised by Queue.get(block=0)/get_nowait()."

    pass


class FullError(Exception):
    "Exception raised by Queue.put(block=0)/put_nowait()."

    pass


class Queue:
    """Create a queue object with a given maximum size.

    If maxsize is <= 0, the queue size is infinite.
    """

    def __init__(self, maxsize=0):
        self.maxsize = maxsize
        self._init(maxsize)

        # mutex must be held whenever the queue is mutating. All methods
        # that acquire mutex must release it before returning. mutex
        # is shared between the three conditions, so acquiring and
        # releasing the conditions also acquires and releases mutex.
        self.mutex = threading.RLock()

        # Notify not_empty whenever an item is added to the queue; a
        # thread waiting to get is notified then.
        self.not_empty = threading.Condition(self.mutex)

        # Notify not_full whenever an item is removed from the queue;
        # a thread waiting to put is notified then.
        self.not_full = threading.Condition(self.mutex)

        # Notify all_tasks_done whenever the number of unfinished tasks
        # drops to zero; thread waiting to join() is notified to resume
        self.all_tasks_done = threading.Condition(self.mutex)
        self.unfinished_tasks = 0

    def task_done(self):
        """Indicate that a formerly enqueued task is complete.

        Used by Queue consumer threads. For each get() used to fetch a task,
        a subsequent call to task_done() tells the queue that the processing
        on the task is complete.

        If a join() is currently blocking, it will resume when all items
        have been processed (meaning that a task_done() call was received
        for every item that had been put() into the queue).

        Raises a ValueError if called more times than there were items
        placed in the queue.
        """
        with self.all_tasks_done:
            unfinished = self.unfinished_tasks - 1
            if unfinished <= 0:
                if unfinished < 0:
                    raise ValueError("task_done() called too many times")
                self.all_tasks_done.notify_all()
            self.unfinished_tasks = unfinished

    def join(self):
        """Blocks until all items in the Queue have been gotten and processed.

        The count of unfinished tasks goes up whenever an item is added to the
        queue. The count goes down whenever a consumer thread calls task_done()
        to indicate the item was retrieved and all work on it is complete.

        When the count of unfinished tasks drops to zero, join() unblocks.
        """
        with self.all_tasks_done:
            while self.unfinished_tasks:
                self.all_tasks_done.wait()

    def qsize(self):
        """Return the approximate size of the queue (not reliable!)."""
        with self.mutex:
            return self._qsize()

    def empty(self):
        """Return True if the queue is empty, False otherwise (not reliable!).

        This method is likely to be removed at some point. Use qsize() == 0
        as a direct substitute, but be aware that either approach risks a race
        condition where a queue can grow before the result of empty() or
        qsize() can be used.

        To create code that needs to wait for all queued tasks to be
        completed, the preferred technique is to use the join() method.
        """
        with self.mutex:
            return not self._qsize()

    def full(self):
        """Return True if the queue is full, False otherwise (not reliable!).

        This method is likely to be removed at some point. Use qsize() >= n
        as a direct substitute, but be aware that either approach risks a race
        condition where a queue can shrink before the result of full() or
        qsize() can be used.
        """
        with self.mutex:
            return 0 < self.maxsize <= self._qsize()

    def put(self, item, block=True, timeout=None):
        """Put an item into the queue.

        If optional args 'block' is true and 'timeout' is None (the default),
        block if necessary until a free slot is available. If 'timeout' is
        a non-negative number, it blocks at most 'timeout' seconds and raises
        the FullError exception if no free slot was available within that time.
        Otherwise ('block' is false), put an item on the queue if a free slot
        is immediately available, else raise the FullError exception ('timeout'
        is ignored in that case).
        """
        with self.not_full:
            if self.maxsize > 0:
                if not block:
                    if self._qsize() >= self.maxsize:
                        raise FullError()
                elif timeout is None:
                    while self._qsize() >= self.maxsize:
                        self.not_full.wait()
                elif timeout < 0:
                    raise ValueError("'timeout' must be a non-negative number")
                else:
                    endtime = time() + timeout
                    while self._qsize() >= self.maxsize:
                        remaining = endtime - time()
                        if remaining <= 0.0:
                            raise FullError()
                        self.not_full.wait(remaining)
            self._put(item)
            self.unfinished_tasks += 1
            self.not_empty.notify()

    def get(self, block=True, timeout=None):
        """Remove and return an item from the queue.

        If optional args 'block' is true and 'timeout' is None (the default),
        block if necessary until an item is available. If 'timeout' is
        a non-negative number, it blocks at most 'timeout' seconds and raises
        the EmptyError exception if no item was available within that time.
        Otherwise ('block' is false), return an item if one is immediately
        available, else raise the EmptyError exception ('timeout' is ignored
        in that case).
        """
        with self.not_empty:
            if not block:
                if not self._qsize():
                    raise EmptyError()
            elif timeout is None:
                while not self._qsize():
                    self.not_empty.wait()
            elif timeout < 0:
                raise ValueError("'timeout' must be a non-negative number")
            else:
                endtime = time() + timeout
                while not self._qsize():
                    remaining = endtime - time()
                    if remaining <= 0.0:
                        raise EmptyError()
                    self.not_empty.wait(remaining)
            item = self._get()
            self.not_full.notify()
            return item

    def put_nowait(self, item):
        """Put an item into the queue without blocking.

        Only enqueue the item if a free slot is immediately available.
        Otherwise raise the FullError exception.
        """
        return self.put(item, block=False)

    def get_nowait(self):
        """Remove and return an item from the queue without blocking.

        Only get an item if one is immediately available. Otherwise
        raise the EmptyError exception.
        """
        return self.get(block=False)

    # Override these methods to implement other queue organizations
    # (e.g. stack or priority queue).
    # These will only be called with appropriate locks held

    # Initialize the queue representation
    def _init(self, maxsize):
        self.queue = deque()  # type: Any

    def _qsize(self):
        return len(self.queue)

    # Put a new item in the queue
    def _put(self, item):
        self.queue.append(item)

    # Get an item from the queue
    def _get(self):
        return self.queue.popleft()
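The vendored class above mirrors the standard library's `queue.Queue` API, with `EmptyError`/`FullError` in place of `queue.Empty`/`queue.Full`. A minimal producer/consumer sketch of that API, using the stdlib class as a stand-in (the vendored module is not importable on its own):

```python
import queue
import threading

# Stand-in for the vendored Queue: same put/get/task_done/join semantics.
q = queue.Queue(maxsize=2)
q.put("a")
q.put("b")

try:
    q.put("c", block=False)  # queue is full -> raises immediately
except queue.Full:           # the vendored class raises FullError here
    print("full")

results = []

def worker():
    while True:
        item = q.get()
        if item is None:     # sentinel tells the consumer to stop
            q.task_done()
            break
        results.append(item)
        q.task_done()        # one task_done() per get()

t = threading.Thread(target=worker)
t.start()
q.put(None)
q.join()   # unblocks once task_done() was called for every put()
t.join()
print(results)  # -> ['a', 'b']
```

`join()` waits on the `all_tasks_done` condition shown above, which is notified when the unfinished-task count reaches zero.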
@@ -0,0 +1,300 @@
from typing import TYPE_CHECKING, TypeVar, Union


# Re-exported for compat, since code out there in the wild might use this variable.
MYPY = TYPE_CHECKING


SENSITIVE_DATA_SUBSTITUTE = "[Filtered]"


class AnnotatedValue:
    """
    Meta information for a data field in the event payload.
    This is to tell Relay that we have tampered with the field's value.
    See:
    https://github.com/getsentry/relay/blob/be12cd49a0f06ea932ed9b9f93a655de5d6ad6d1/relay-general/src/types/meta.rs#L407-L423
    """

    __slots__ = ("value", "metadata")

    def __init__(self, value, metadata):
        # type: (Optional[Any], Dict[str, Any]) -> None
        self.value = value
        self.metadata = metadata

    def __eq__(self, other):
        # type: (Any) -> bool
        if not isinstance(other, AnnotatedValue):
            return False

        return self.value == other.value and self.metadata == other.metadata

    @classmethod
    def removed_because_raw_data(cls):
        # type: () -> AnnotatedValue
        """The value was removed because it could not be parsed. This is done for request body values that are not json nor a form."""
        return AnnotatedValue(
            value="",
            metadata={
                "rem": [  # Remark
                    [
                        "!raw",  # Unparsable raw data
                        "x",  # The field's original value was removed
                    ]
                ]
            },
        )

    @classmethod
    def removed_because_over_size_limit(cls):
        # type: () -> AnnotatedValue
        """The actual value was removed because the size of the field exceeded the configured maximum size (specified with the max_request_body_size sdk option)"""
        return AnnotatedValue(
            value="",
            metadata={
                "rem": [  # Remark
                    [
                        "!config",  # Because of configured maximum size
                        "x",  # The field's original value was removed
                    ]
                ]
            },
        )

    @classmethod
    def substituted_because_contains_sensitive_data(cls):
        # type: () -> AnnotatedValue
        """The actual value was removed because it contained sensitive information."""
        return AnnotatedValue(
            value=SENSITIVE_DATA_SUBSTITUTE,
            metadata={
                "rem": [  # Remark
                    [
                        "!config",  # Because of SDK configuration (in this case the config is the hard coded removal of certain django cookies)
                        "s",  # The field's original value was substituted
                    ]
                ]
            },
        )


T = TypeVar("T")
Annotated = Union[AnnotatedValue, T]


if TYPE_CHECKING:
    from collections.abc import Container, MutableMapping, Sequence

    from datetime import datetime

    from types import TracebackType
    from typing import Any
    from typing import Callable
    from typing import Dict
    from typing import Mapping
    from typing import NotRequired
    from typing import Optional
    from typing import Tuple
    from typing import Type
    from typing_extensions import Literal, TypedDict

    class SDKInfo(TypedDict):
        name: str
        version: str
        packages: Sequence[Mapping[str, str]]

    # "critical" is an alias of "fatal" recognized by Relay
    LogLevelStr = Literal["fatal", "critical", "error", "warning", "info", "debug"]

    DurationUnit = Literal[
        "nanosecond",
        "microsecond",
        "millisecond",
        "second",
        "minute",
        "hour",
        "day",
        "week",
    ]

    InformationUnit = Literal[
        "bit",
        "byte",
        "kilobyte",
        "kibibyte",
        "megabyte",
        "mebibyte",
        "gigabyte",
        "gibibyte",
        "terabyte",
        "tebibyte",
        "petabyte",
        "pebibyte",
        "exabyte",
        "exbibyte",
    ]

    FractionUnit = Literal["ratio", "percent"]
    MeasurementUnit = Union[DurationUnit, InformationUnit, FractionUnit, str]

    MeasurementValue = TypedDict(
        "MeasurementValue",
        {
            "value": float,
            "unit": NotRequired[Optional[MeasurementUnit]],
        },
    )

    Event = TypedDict(
        "Event",
        {
            "breadcrumbs": dict[
                Literal["values"], list[dict[str, Any]]
            ],  # TODO: We can expand on this type
            "check_in_id": str,
            "contexts": dict[str, dict[str, object]],
            "dist": str,
            "duration": Optional[float],
            "environment": str,
            "errors": list[dict[str, Any]],  # TODO: We can expand on this type
            "event_id": str,
            "exception": dict[
                Literal["values"], list[dict[str, Any]]
            ],  # TODO: We can expand on this type
            "extra": MutableMapping[str, object],
            "fingerprint": list[str],
            "level": LogLevelStr,
            "logentry": Mapping[str, object],
            "logger": str,
            "measurements": dict[str, MeasurementValue],
            "message": str,
            "modules": dict[str, str],
            "monitor_config": Mapping[str, object],
            "monitor_slug": Optional[str],
            "platform": Literal["python"],
            "profile": object,  # Should be sentry_sdk.profiler.Profile, but we can't import that here due to circular imports
            "release": str,
            "request": dict[str, object],
            "sdk": Mapping[str, object],
            "server_name": str,
            "spans": Annotated[list[dict[str, object]]],
            "stacktrace": dict[
                str, object
            ],  # We access this key in the code, but I am unsure whether we ever set it
            "start_timestamp": datetime,
            "status": Optional[str],
            "tags": MutableMapping[
                str, str
            ],  # Tags must be less than 200 characters each
            "threads": dict[
                Literal["values"], list[dict[str, Any]]
            ],  # TODO: We can expand on this type
            "timestamp": Optional[datetime],  # Must be set before sending the event
            "transaction": str,
            "transaction_info": Mapping[str, Any],  # TODO: We can expand on this type
            "type": Literal["check_in", "transaction"],
            "user": dict[str, object],
            "_dropped_spans": int,
            "_metrics_summary": dict[str, object],
        },
        total=False,
    )

    ExcInfo = Union[
        tuple[Type[BaseException], BaseException, Optional[TracebackType]],
        tuple[None, None, None],
    ]

    Hint = Dict[str, Any]

    Breadcrumb = Dict[str, Any]
    BreadcrumbHint = Dict[str, Any]

    SamplingContext = Dict[str, Any]

    EventProcessor = Callable[[Event, Hint], Optional[Event]]
    ErrorProcessor = Callable[[Event, ExcInfo], Optional[Event]]
    BreadcrumbProcessor = Callable[[Breadcrumb, BreadcrumbHint], Optional[Breadcrumb]]
    TransactionProcessor = Callable[[Event, Hint], Optional[Event]]

    TracesSampler = Callable[[SamplingContext], Union[float, int, bool]]

    # https://github.com/python/mypy/issues/5710
    NotImplementedType = Any

    EventDataCategory = Literal[
        "default",
        "error",
        "crash",
        "transaction",
        "security",
        "attachment",
        "session",
        "internal",
        "profile",
        "profile_chunk",
        "metric_bucket",
        "monitor",
        "span",
    ]
    SessionStatus = Literal["ok", "exited", "crashed", "abnormal"]

    ContinuousProfilerMode = Literal["thread", "gevent", "unknown"]
    ProfilerMode = Union[ContinuousProfilerMode, Literal["sleep"]]

    # Type of the metric.
    MetricType = Literal["d", "s", "g", "c"]

    # Value of the metric.
    MetricValue = Union[int, float, str]

    # Internal representation of tags as a tuple of tuples (this is done in order to allow for the same key to exist
    # multiple times).
    MetricTagsInternal = Tuple[Tuple[str, str], ...]

    # External representation of tags as a dictionary.
    MetricTagValue = Union[str, int, float, None]
    MetricTags = Mapping[str, MetricTagValue]

    # Value inside the generator for the metric value.
    FlushedMetricValue = Union[int, float]

    BucketKey = Tuple[MetricType, str, MeasurementUnit, MetricTagsInternal]
    MetricMetaKey = Tuple[MetricType, str, MeasurementUnit]

    MonitorConfigScheduleType = Literal["crontab", "interval"]
    MonitorConfigScheduleUnit = Literal[
        "year",
        "month",
        "week",
        "day",
        "hour",
        "minute",
        "second",  # not supported in Sentry and will result in a warning
    ]

    MonitorConfigSchedule = TypedDict(
        "MonitorConfigSchedule",
        {
            "type": MonitorConfigScheduleType,
            "value": Union[int, str],
            "unit": MonitorConfigScheduleUnit,
        },
        total=False,
    )

    MonitorConfig = TypedDict(
        "MonitorConfig",
        {
            "schedule": MonitorConfigSchedule,
            "timezone": str,
            "checkin_margin": int,
            "max_runtime": int,
            "failure_issue_threshold": int,
            "recovery_threshold": int,
        },
        total=False,
    )

    HttpStatusCodeRange = Union[int, Container[int]]
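The comments above describe `MetricTagsInternal` as a tuple of tuples so that tag sets are hashable (usable in `BucketKey`) while still permitting duplicate keys. A hypothetical helper (the function name is ours, not the SDK's) sketching the `MetricTags` mapping to `MetricTagsInternal` conversion:

```python
# Hypothetical helper: turn an external tags mapping into the internal
# sorted tuple-of-(key, str(value)) form, which is hashable and
# order-stable so it can serve as part of a bucket key.
def to_internal_tags(tags):
    # type: (dict) -> tuple
    return tuple(sorted((k, str(v)) for k, v in tags.items()))

print(to_internal_tags({"env": "prod", "release": 42}))
# -> (('env', 'prod'), ('release', '42'))
```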
@@ -0,0 +1,98 @@
"""
Copyright (c) 2007 by the Pallets team.

Some rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:

* Redistributions of source code must retain the above copyright notice,
  this list of conditions and the following disclaimer.

* Redistributions in binary form must reproduce the above copyright
  notice, this list of conditions and the following disclaimer in the
  documentation and/or other materials provided with the distribution.

* Neither the name of the copyright holder nor the names of its
  contributors may be used to endorse or promote products derived from
  this software without specific prior written permission.

THIS SOFTWARE AND DOCUMENTATION IS PROVIDED BY THE COPYRIGHT HOLDERS AND
CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING,
BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND
FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF
USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
THIS SOFTWARE AND DOCUMENTATION, EVEN IF ADVISED OF THE POSSIBILITY OF
SUCH DAMAGE.
"""

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from typing import Dict
    from typing import Iterator
    from typing import Tuple


#
# `get_headers` comes from `werkzeug.datastructures.EnvironHeaders`
# https://github.com/pallets/werkzeug/blob/0.14.1/werkzeug/datastructures.py#L1361
#
# We need this function because Django does not give us a "pure" http header
# dict. So we might as well use it for all WSGI integrations.
#
def _get_headers(environ):
    # type: (Dict[str, str]) -> Iterator[Tuple[str, str]]
    """
    Returns only proper HTTP headers.
    """
    for key, value in environ.items():
        key = str(key)
        if key.startswith("HTTP_") and key not in (
            "HTTP_CONTENT_TYPE",
            "HTTP_CONTENT_LENGTH",
        ):
            yield key[5:].replace("_", "-").title(), value
        elif key in ("CONTENT_TYPE", "CONTENT_LENGTH"):
            yield key.replace("_", "-").title(), value


#
# `get_host` comes from `werkzeug.wsgi.get_host`
# https://github.com/pallets/werkzeug/blob/1.0.1/src/werkzeug/wsgi.py#L145
#
def get_host(environ, use_x_forwarded_for=False):
    # type: (Dict[str, str], bool) -> str
    """
    Return the host for the given WSGI environment.
    """
    if use_x_forwarded_for and "HTTP_X_FORWARDED_HOST" in environ:
        rv = environ["HTTP_X_FORWARDED_HOST"]
        if environ["wsgi.url_scheme"] == "http" and rv.endswith(":80"):
            rv = rv[:-3]
        elif environ["wsgi.url_scheme"] == "https" and rv.endswith(":443"):
            rv = rv[:-4]
    elif environ.get("HTTP_HOST"):
        rv = environ["HTTP_HOST"]
        if environ["wsgi.url_scheme"] == "http" and rv.endswith(":80"):
            rv = rv[:-3]
        elif environ["wsgi.url_scheme"] == "https" and rv.endswith(":443"):
            rv = rv[:-4]
    elif environ.get("SERVER_NAME"):
        rv = environ["SERVER_NAME"]
        if (environ["wsgi.url_scheme"], environ["SERVER_PORT"]) not in (
            ("https", "443"),
            ("http", "80"),
        ):
            rv += ":" + environ["SERVER_PORT"]
    else:
        # In spite of the WSGI spec, SERVER_NAME might not be present.
        rv = "unknown"

    return rv
@@ -0,0 +1,115 @@
import inspect
from functools import wraps

import sentry_sdk.utils
from sentry_sdk import start_span
from sentry_sdk.tracing import Span
from sentry_sdk.utils import ContextVar

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from typing import Optional, Callable, Any

_ai_pipeline_name = ContextVar("ai_pipeline_name", default=None)


def set_ai_pipeline_name(name):
    # type: (Optional[str]) -> None
    _ai_pipeline_name.set(name)


def get_ai_pipeline_name():
    # type: () -> Optional[str]
    return _ai_pipeline_name.get()


def ai_track(description, **span_kwargs):
    # type: (str, Any) -> Callable[..., Any]
    def decorator(f):
        # type: (Callable[..., Any]) -> Callable[..., Any]
        def sync_wrapped(*args, **kwargs):
            # type: (Any, Any) -> Any
            curr_pipeline = _ai_pipeline_name.get()
            op = span_kwargs.get("op", "ai.run" if curr_pipeline else "ai.pipeline")

            with start_span(name=description, op=op, **span_kwargs) as span:
                for k, v in kwargs.pop("sentry_tags", {}).items():
                    span.set_tag(k, v)
                for k, v in kwargs.pop("sentry_data", {}).items():
                    span.set_data(k, v)
                if curr_pipeline:
                    span.set_data("ai.pipeline.name", curr_pipeline)
                    return f(*args, **kwargs)
                else:
                    _ai_pipeline_name.set(description)
                    try:
                        res = f(*args, **kwargs)
                    except Exception as e:
                        event, hint = sentry_sdk.utils.event_from_exception(
                            e,
                            client_options=sentry_sdk.get_client().options,
                            mechanism={"type": "ai_monitoring", "handled": False},
                        )
                        sentry_sdk.capture_event(event, hint=hint)
                        raise e from None
                    finally:
                        _ai_pipeline_name.set(None)
                    return res

        async def async_wrapped(*args, **kwargs):
            # type: (Any, Any) -> Any
            curr_pipeline = _ai_pipeline_name.get()
            op = span_kwargs.get("op", "ai.run" if curr_pipeline else "ai.pipeline")

            with start_span(name=description, op=op, **span_kwargs) as span:
                for k, v in kwargs.pop("sentry_tags", {}).items():
                    span.set_tag(k, v)
                for k, v in kwargs.pop("sentry_data", {}).items():
                    span.set_data(k, v)
                if curr_pipeline:
                    span.set_data("ai.pipeline.name", curr_pipeline)
                    return await f(*args, **kwargs)
                else:
                    _ai_pipeline_name.set(description)
                    try:
                        res = await f(*args, **kwargs)
                    except Exception as e:
                        event, hint = sentry_sdk.utils.event_from_exception(
                            e,
                            client_options=sentry_sdk.get_client().options,
                            mechanism={"type": "ai_monitoring", "handled": False},
                        )
                        sentry_sdk.capture_event(event, hint=hint)
                        raise e from None
                    finally:
                        _ai_pipeline_name.set(None)
                    return res

        if inspect.iscoroutinefunction(f):
            return wraps(f)(async_wrapped)
        else:
            return wraps(f)(sync_wrapped)

    return decorator


def record_token_usage(
    span, prompt_tokens=None, completion_tokens=None, total_tokens=None
):
    # type: (Span, Optional[int], Optional[int], Optional[int]) -> None
    ai_pipeline_name = get_ai_pipeline_name()
    if ai_pipeline_name:
        span.set_data("ai.pipeline.name", ai_pipeline_name)
    if prompt_tokens is not None:
        span.set_measurement("ai_prompt_tokens_used", value=prompt_tokens)
    if completion_tokens is not None:
        span.set_measurement("ai_completion_tokens_used", value=completion_tokens)
    if (
        total_tokens is None
        and prompt_tokens is not None
        and completion_tokens is not None
    ):
        total_tokens = prompt_tokens + completion_tokens
    if total_tokens is not None:
        span.set_measurement("ai_total_tokens_used", total_tokens)
@@ -0,0 +1,32 @@
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from typing import Any

from sentry_sdk.tracing import Span
from sentry_sdk.utils import logger


def _normalize_data(data):
    # type: (Any) -> Any

    # convert pydantic data (e.g. OpenAI v1+) to json compatible format
    if hasattr(data, "model_dump"):
        try:
            return data.model_dump()
        except Exception as e:
            logger.warning("Could not convert pydantic data to JSON: %s", e)
            return data
    if isinstance(data, list):
        if len(data) == 1:
            return _normalize_data(data[0])  # remove empty dimensions
        return list(_normalize_data(x) for x in data)
    if isinstance(data, dict):
        return {k: _normalize_data(v) for (k, v) in data.items()}
    return data


def set_data_normalized(span, key, value):
    # type: (Span, str, Any) -> None
    normalized = _normalize_data(value)
    span.set_data(key, normalized)
@@ -0,0 +1,433 @@
|
||||
import inspect
|
||||
import warnings
|
||||
from contextlib import contextmanager
|
||||
|
||||
from sentry_sdk import tracing_utils, Client
|
||||
from sentry_sdk._init_implementation import init
|
||||
from sentry_sdk.consts import INSTRUMENTER
|
||||
from sentry_sdk.scope import Scope, _ScopeManager, new_scope, isolation_scope
|
||||
from sentry_sdk.tracing import NoOpSpan, Transaction, trace
|
||||
from sentry_sdk.crons import monitor
|
||||
|
||||
from typing import TYPE_CHECKING
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from collections.abc import Mapping
|
||||
|
||||
from typing import Any
|
||||
from typing import Dict
|
||||
from typing import Generator
|
||||
from typing import Optional
|
||||
from typing import overload
|
||||
from typing import Callable
|
||||
from typing import TypeVar
|
||||
from typing import ContextManager
|
||||
from typing import Union
|
||||
|
||||
from typing_extensions import Unpack
|
||||
|
||||
from sentry_sdk.client import BaseClient
|
||||
from sentry_sdk._types import (
|
||||
Event,
|
||||
Hint,
|
||||
Breadcrumb,
|
||||
BreadcrumbHint,
|
||||
ExcInfo,
|
||||
MeasurementUnit,
|
||||
LogLevelStr,
|
||||
SamplingContext,
|
||||
)
|
||||
from sentry_sdk.tracing import Span, TransactionKwargs
|
||||
|
||||
T = TypeVar("T")
|
||||
F = TypeVar("F", bound=Callable[..., Any])
|
||||
else:
|
||||
|
||||
def overload(x):
|
||||
# type: (T) -> T
|
||||
return x
|
||||
|
||||
|
||||
# When changing this, update __all__ in __init__.py too
|
||||
__all__ = [
|
||||
"init",
|
||||
"add_breadcrumb",
|
||||
"capture_event",
|
||||
"capture_exception",
|
||||
"capture_message",
|
||||
"configure_scope",
|
||||
"continue_trace",
|
||||
"flush",
|
||||
"get_baggage",
|
||||
"get_client",
|
||||
"get_global_scope",
|
||||
"get_isolation_scope",
|
||||
"get_current_scope",
|
||||
"get_current_span",
|
||||
"get_traceparent",
|
||||
"is_initialized",
|
||||
"isolation_scope",
|
||||
"last_event_id",
|
||||
"new_scope",
|
||||
"push_scope",
|
||||
"set_context",
|
||||
"set_extra",
|
||||
"set_level",
|
||||
"set_measurement",
|
||||
"set_tag",
|
||||
"set_tags",
|
||||
"set_user",
|
||||
"start_span",
|
||||
"start_transaction",
|
||||
"trace",
|
||||
"monitor",
|
||||
]
|
||||
|
||||
|
||||
def scopemethod(f):
|
||||
# type: (F) -> F
|
||||
f.__doc__ = "%s\n\n%s" % (
|
||||
"Alias for :py:meth:`sentry_sdk.Scope.%s`" % f.__name__,
|
||||
inspect.getdoc(getattr(Scope, f.__name__)),
|
||||
)
|
||||
return f
|
||||
|
||||
|
||||
def clientmethod(f):
|
||||
# type: (F) -> F
|
||||
f.__doc__ = "%s\n\n%s" % (
|
||||
"Alias for :py:meth:`sentry_sdk.Client.%s`" % f.__name__,
|
||||
inspect.getdoc(getattr(Client, f.__name__)),
|
||||
)
|
||||
return f
|
||||
|
||||
|
||||
@scopemethod
def get_client():
    # type: () -> BaseClient
    return Scope.get_client()


def is_initialized():
    # type: () -> bool
    """
    .. versionadded:: 2.0.0

    Returns whether Sentry has been initialized or not.

    If a client is available and the client is active
    (meaning it is configured to send data) then
    Sentry is initialized.
    """
    return get_client().is_active()


@scopemethod
def get_global_scope():
    # type: () -> Scope
    return Scope.get_global_scope()


@scopemethod
def get_isolation_scope():
    # type: () -> Scope
    return Scope.get_isolation_scope()


@scopemethod
def get_current_scope():
    # type: () -> Scope
    return Scope.get_current_scope()


@scopemethod
def last_event_id():
    # type: () -> Optional[str]
    """
    See :py:meth:`sentry_sdk.Scope.last_event_id` documentation regarding
    this method's limitations.
    """
    return Scope.last_event_id()


@scopemethod
def capture_event(
    event,  # type: Event
    hint=None,  # type: Optional[Hint]
    scope=None,  # type: Optional[Any]
    **scope_kwargs,  # type: Any
):
    # type: (...) -> Optional[str]
    return get_current_scope().capture_event(event, hint, scope=scope, **scope_kwargs)


@scopemethod
def capture_message(
    message,  # type: str
    level=None,  # type: Optional[LogLevelStr]
    scope=None,  # type: Optional[Any]
    **scope_kwargs,  # type: Any
):
    # type: (...) -> Optional[str]
    return get_current_scope().capture_message(
        message, level, scope=scope, **scope_kwargs
    )


@scopemethod
def capture_exception(
    error=None,  # type: Optional[Union[BaseException, ExcInfo]]
    scope=None,  # type: Optional[Any]
    **scope_kwargs,  # type: Any
):
    # type: (...) -> Optional[str]
    return get_current_scope().capture_exception(error, scope=scope, **scope_kwargs)


@scopemethod
def add_breadcrumb(
    crumb=None,  # type: Optional[Breadcrumb]
    hint=None,  # type: Optional[BreadcrumbHint]
    **kwargs,  # type: Any
):
    # type: (...) -> None
    return get_isolation_scope().add_breadcrumb(crumb, hint, **kwargs)


@overload
def configure_scope():
    # type: () -> ContextManager[Scope]
    pass


@overload
def configure_scope(  # noqa: F811
    callback,  # type: Callable[[Scope], None]
):
    # type: (...) -> None
    pass


def configure_scope(  # noqa: F811
    callback=None,  # type: Optional[Callable[[Scope], None]]
):
    # type: (...) -> Optional[ContextManager[Scope]]
    """
    Reconfigures the scope.

    :param callback: If provided, call the callback with the current scope.

    :returns: If no callback is provided, returns a context manager that returns the scope.
    """
    warnings.warn(
        "sentry_sdk.configure_scope is deprecated and will be removed in the next major version. "
        "Please consult our migration guide to learn how to migrate to the new API: "
        "https://docs.sentry.io/platforms/python/migration/1.x-to-2.x#scope-configuring",
        DeprecationWarning,
        stacklevel=2,
    )

    scope = get_isolation_scope()
    scope.generate_propagation_context()

    if callback is not None:
        # TODO: used to return None when client is None. Check if this changes behavior.
        callback(scope)

        return None

    @contextmanager
    def inner():
        # type: () -> Generator[Scope, None, None]
        yield scope

    return inner()


@overload
def push_scope():
    # type: () -> ContextManager[Scope]
    pass


@overload
def push_scope(  # noqa: F811
    callback,  # type: Callable[[Scope], None]
):
    # type: (...) -> None
    pass


def push_scope(  # noqa: F811
    callback=None,  # type: Optional[Callable[[Scope], None]]
):
    # type: (...) -> Optional[ContextManager[Scope]]
    """
    Pushes a new layer on the scope stack.

    :param callback: If provided, this method pushes a scope, calls
        `callback`, and pops the scope again.

    :returns: If no `callback` is provided, a context manager that should
        be used to pop the scope again.
    """
    warnings.warn(
        "sentry_sdk.push_scope is deprecated and will be removed in the next major version. "
        "Please consult our migration guide to learn how to migrate to the new API: "
        "https://docs.sentry.io/platforms/python/migration/1.x-to-2.x#scope-pushing",
        DeprecationWarning,
        stacklevel=2,
    )

    if callback is not None:
        with warnings.catch_warnings():
            warnings.simplefilter("ignore", DeprecationWarning)
            with push_scope() as scope:
                callback(scope)
        return None

    return _ScopeManager()


@scopemethod
def set_tag(key, value):
    # type: (str, Any) -> None
    return get_isolation_scope().set_tag(key, value)


@scopemethod
def set_tags(tags):
    # type: (Mapping[str, object]) -> None
    return get_isolation_scope().set_tags(tags)


@scopemethod
def set_context(key, value):
    # type: (str, Dict[str, Any]) -> None
    return get_isolation_scope().set_context(key, value)


@scopemethod
def set_extra(key, value):
    # type: (str, Any) -> None
    return get_isolation_scope().set_extra(key, value)


@scopemethod
def set_user(value):
    # type: (Optional[Dict[str, Any]]) -> None
    return get_isolation_scope().set_user(value)


@scopemethod
def set_level(value):
    # type: (LogLevelStr) -> None
    return get_isolation_scope().set_level(value)


@clientmethod
def flush(
    timeout=None,  # type: Optional[float]
    callback=None,  # type: Optional[Callable[[int, float], None]]
):
    # type: (...) -> None
    return get_client().flush(timeout=timeout, callback=callback)


@scopemethod
def start_span(
    **kwargs,  # type: Any
):
    # type: (...) -> Span
    return get_current_scope().start_span(**kwargs)


@scopemethod
def start_transaction(
    transaction=None,  # type: Optional[Transaction]
    instrumenter=INSTRUMENTER.SENTRY,  # type: str
    custom_sampling_context=None,  # type: Optional[SamplingContext]
    **kwargs,  # type: Unpack[TransactionKwargs]
):
    # type: (...) -> Union[Transaction, NoOpSpan]
    """
    Start and return a transaction on the current scope.

    Start an existing transaction if given, otherwise create and start a new
    transaction with kwargs.

    This is the entry point to manual tracing instrumentation.

    A tree structure can be built by adding child spans to the transaction,
    and child spans to other spans. To start a new child span within the
    transaction or any span, call the respective `.start_child()` method.

    Every child span must be finished before the transaction is finished,
    otherwise the unfinished spans are discarded.

    When used as context managers, spans and transactions are automatically
    finished at the end of the `with` block. If not using context managers,
    call the `.finish()` method.

    When the transaction is finished, it will be sent to Sentry with all its
    finished child spans.

    :param transaction: The transaction to start. If omitted, we create and
        start a new transaction.
    :param instrumenter: This parameter is meant for internal use only. It
        will be removed in the next major version.
    :param custom_sampling_context: The transaction's custom sampling context.
    :param kwargs: Optional keyword arguments to be passed to the Transaction
        constructor. See :py:class:`sentry_sdk.tracing.Transaction` for
        available arguments.
    """
    return get_current_scope().start_transaction(
        transaction, instrumenter, custom_sampling_context, **kwargs
    )

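The docstring above describes the span-tree and context-manager semantics. A stdlib-only toy sketch of that behavior (a hypothetical `ToySpan`, not the SDK's `Transaction`/`Span` classes): spans finish automatically at the end of their `with` block, children before the root.

```python
class ToySpan:
    # Toy stand-in for a transaction/span: tracks children and a finished flag.
    def __init__(self, op):
        self.op = op
        self.children = []
        self.finished = False

    def start_child(self, op):
        child = ToySpan(op)
        self.children.append(child)
        return child

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        # Leaving the `with` block finishes the span automatically.
        self.finish()

    def finish(self):
        self.finished = True


with ToySpan(op="task") as tx:
    with tx.start_child(op="db.query") as child:
        pass  # work happens here

# The inner `with` exits first, so every child is finished before the root.
```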
def set_measurement(name, value, unit=""):
    # type: (str, float, MeasurementUnit) -> None
    transaction = get_current_scope().transaction
    if transaction is not None:
        transaction.set_measurement(name, value, unit)


def get_current_span(scope=None):
    # type: (Optional[Scope]) -> Optional[Span]
    """
    Returns the currently active span if there is one running, otherwise `None`
    """
    return tracing_utils.get_current_span(scope)


def get_traceparent():
    # type: () -> Optional[str]
    """
    Returns the traceparent either from the active span or from the scope.
    """
    return get_current_scope().get_traceparent()


def get_baggage():
    # type: () -> Optional[str]
    """
    Returns Baggage either from the active span or from the scope.
    """
    baggage = get_current_scope().get_baggage()
    if baggage is not None:
        return baggage.serialize()

    return None


def continue_trace(
    environ_or_headers, op=None, name=None, source=None, origin="manual"
):
    # type: (Dict[str, Any], Optional[str], Optional[str], Optional[str], str) -> Transaction
    """
    Sets the propagation context from environment or headers and returns a transaction.
    """
    return get_isolation_scope().continue_trace(
        environ_or_headers, op, name, source, origin
    )
@@ -0,0 +1,75 @@
import os
import mimetypes

from sentry_sdk.envelope import Item, PayloadRef

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from typing import Optional, Union, Callable


class Attachment:
    """Additional files/data to send along with an event.

    This class stores attachments that can be sent along with an event. Attachments are files or other data, e.g.
    config or log files, that are relevant to an event. Attachments are set on the ``Scope``, and are sent along with
    all non-transaction events (or all events including transactions if ``add_to_transactions`` is ``True``) that are
    captured within the ``Scope``.

    To add an attachment to a ``Scope``, use :py:meth:`sentry_sdk.Scope.add_attachment`. The parameters for
    ``add_attachment`` are the same as the parameters for this class's constructor.

    :param bytes: Raw bytes of the attachment, or a function that returns the raw bytes. Must be provided unless
        ``path`` is provided.
    :param filename: The filename of the attachment. Must be provided unless ``path`` is provided.
    :param path: Path to a file to attach. Must be provided unless ``bytes`` is provided.
    :param content_type: The content type of the attachment. If not provided, it will be guessed from the ``filename``
        parameter, if available, or the ``path`` parameter if ``filename`` is ``None``.
    :param add_to_transactions: Whether to add this attachment to transactions. Defaults to ``False``.
    """

    def __init__(
        self,
        bytes=None,  # type: Union[None, bytes, Callable[[], bytes]]
        filename=None,  # type: Optional[str]
        path=None,  # type: Optional[str]
        content_type=None,  # type: Optional[str]
        add_to_transactions=False,  # type: bool
    ):
        # type: (...) -> None
        if bytes is None and path is None:
            raise TypeError("path or raw bytes required for attachment")
        if filename is None and path is not None:
            filename = os.path.basename(path)
        if filename is None:
            raise TypeError("filename is required for attachment")
        if content_type is None:
            content_type = mimetypes.guess_type(filename)[0]
        self.bytes = bytes
        self.filename = filename
        self.path = path
        self.content_type = content_type
        self.add_to_transactions = add_to_transactions

    def to_envelope_item(self):
        # type: () -> Item
        """Returns an envelope item for this attachment."""
        payload = None  # type: Union[None, PayloadRef, bytes]
        if self.bytes is not None:
            if callable(self.bytes):
                payload = self.bytes()
            else:
                payload = self.bytes
        else:
            payload = PayloadRef(path=self.path)
        return Item(
            payload=payload,
            type="attachment",
            content_type=self.content_type,
            filename=self.filename,
        )

    def __repr__(self):
        # type: () -> str
        return "<Attachment %r>" % (self.filename,)
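The constructor's fallbacks can be seen in isolation with the stdlib alone: derive the filename from the path, then guess the content type from the filename, as `__init__` above does (the path here is hypothetical):

```python
import mimetypes
import os

# Mirrors the Attachment constructor's fallbacks: filename from path,
# then content type guessed from the filename.
path = "/var/log/app/config.json"  # hypothetical path
filename = os.path.basename(path)
content_type = mimetypes.guess_type(filename)[0]
```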
@@ -0,0 +1,959 @@
import os
import uuid
import random
import socket
from collections.abc import Mapping
from datetime import datetime, timezone
from importlib import import_module
from typing import TYPE_CHECKING, List, Dict, cast, overload
import warnings

from sentry_sdk._compat import PY37, check_uwsgi_thread_support
from sentry_sdk.utils import (
    AnnotatedValue,
    ContextVar,
    capture_internal_exceptions,
    current_stacktrace,
    env_to_bool,
    format_timestamp,
    get_sdk_name,
    get_type_name,
    get_default_release,
    handle_in_app,
    is_gevent,
    logger,
)
from sentry_sdk.serializer import serialize
from sentry_sdk.tracing import trace
from sentry_sdk.transport import BaseHttpTransport, make_transport
from sentry_sdk.consts import (
    DEFAULT_MAX_VALUE_LENGTH,
    DEFAULT_OPTIONS,
    INSTRUMENTER,
    VERSION,
    ClientConstructor,
)
from sentry_sdk.integrations import _DEFAULT_INTEGRATIONS, setup_integrations
from sentry_sdk.sessions import SessionFlusher
from sentry_sdk.envelope import Envelope
from sentry_sdk.profiler.continuous_profiler import setup_continuous_profiler
from sentry_sdk.profiler.transaction_profiler import (
    has_profiling_enabled,
    Profile,
    setup_profiler,
)
from sentry_sdk.scrubber import EventScrubber
from sentry_sdk.monitor import Monitor
from sentry_sdk.spotlight import setup_spotlight

if TYPE_CHECKING:
    from typing import Any
    from typing import Callable
    from typing import Optional
    from typing import Sequence
    from typing import Type
    from typing import Union
    from typing import TypeVar

    from sentry_sdk._types import Event, Hint, SDKInfo
    from sentry_sdk.integrations import Integration
    from sentry_sdk.metrics import MetricsAggregator
    from sentry_sdk.scope import Scope
    from sentry_sdk.session import Session
    from sentry_sdk.spotlight import SpotlightClient
    from sentry_sdk.transport import Transport

    I = TypeVar("I", bound=Integration)  # noqa: E741

_client_init_debug = ContextVar("client_init_debug")


SDK_INFO = {
    "name": "sentry.python",  # SDK name will be overridden after integrations have been loaded with sentry_sdk.integrations.setup_integrations()
    "version": VERSION,
    "packages": [{"name": "pypi:sentry-sdk", "version": VERSION}],
}  # type: SDKInfo


def _get_options(*args, **kwargs):
    # type: (*Optional[str], **Any) -> Dict[str, Any]
    if args and (isinstance(args[0], (bytes, str)) or args[0] is None):
        dsn = args[0]  # type: Optional[str]
        args = args[1:]
    else:
        dsn = None

    if len(args) > 1:
        raise TypeError("Only single positional argument is expected")

    rv = dict(DEFAULT_OPTIONS)
    options = dict(*args, **kwargs)
    if dsn is not None and options.get("dsn") is None:
        options["dsn"] = dsn

    for key, value in options.items():
        if key not in rv:
            raise TypeError("Unknown option %r" % (key,))

        rv[key] = value

    if rv["dsn"] is None:
        rv["dsn"] = os.environ.get("SENTRY_DSN")

    if rv["release"] is None:
        rv["release"] = get_default_release()

    if rv["environment"] is None:
        rv["environment"] = os.environ.get("SENTRY_ENVIRONMENT") or "production"

    if rv["debug"] is None:
        rv["debug"] = env_to_bool(os.environ.get("SENTRY_DEBUG", "False"), strict=True)

    if rv["server_name"] is None and hasattr(socket, "gethostname"):
        rv["server_name"] = socket.gethostname()

    if rv["instrumenter"] is None:
        rv["instrumenter"] = INSTRUMENTER.SENTRY

    if rv["project_root"] is None:
        try:
            project_root = os.getcwd()
        except Exception:
            project_root = None

        rv["project_root"] = project_root

    if rv["enable_tracing"] is True and rv["traces_sample_rate"] is None:
        rv["traces_sample_rate"] = 1.0

    if rv["event_scrubber"] is None:
        rv["event_scrubber"] = EventScrubber(
            send_default_pii=(
                False if rv["send_default_pii"] is None else rv["send_default_pii"]
            )
        )

    if rv["socket_options"] and not isinstance(rv["socket_options"], list):
        logger.warning(
            "Ignoring socket_options because of unexpected format. See urllib3.HTTPConnection.socket_options for the expected format."
        )
        rv["socket_options"] = None

    if rv["enable_tracing"] is not None:
        warnings.warn(
            "The `enable_tracing` parameter is deprecated. Please use `traces_sample_rate` instead.",
            DeprecationWarning,
            stacklevel=2,
        )

    return rv

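`_get_options` above follows a defaults-overlay-validate shape: start from a copy of the defaults, apply user options, and reject unknown keys. A stdlib-only sketch of that pattern (the `TOY_DEFAULTS` dict and `get_toy_options` function are illustrative, not the SDK's real option set):

```python
TOY_DEFAULTS = {"dsn": None, "debug": False, "release": None}  # hypothetical defaults


def get_toy_options(**kwargs):
    # Start from a fresh copy of the defaults, overlay user-provided options,
    # and reject unknown keys, just like `_get_options` does.
    rv = dict(TOY_DEFAULTS)
    for key, value in kwargs.items():
        if key not in rv:
            raise TypeError("Unknown option %r" % (key,))
        rv[key] = value
    return rv


opts = get_toy_options(debug=True)
# opts == {"dsn": None, "debug": True, "release": None}
```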
try:
    # Python 3.6+
    module_not_found_error = ModuleNotFoundError
except Exception:
    # Older Python versions
    module_not_found_error = ImportError  # type: ignore


class BaseClient:
    """
    .. versionadded:: 2.0.0

    The basic definition of a client that is used for sending data to Sentry.
    """

    spotlight = None  # type: Optional[SpotlightClient]

    def __init__(self, options=None):
        # type: (Optional[Dict[str, Any]]) -> None
        self.options = (
            options if options is not None else DEFAULT_OPTIONS
        )  # type: Dict[str, Any]

        self.transport = None  # type: Optional[Transport]
        self.monitor = None  # type: Optional[Monitor]
        self.metrics_aggregator = None  # type: Optional[MetricsAggregator]

    def __getstate__(self, *args, **kwargs):
        # type: (*Any, **Any) -> Any
        return {"options": {}}

    def __setstate__(self, *args, **kwargs):
        # type: (*Any, **Any) -> None
        pass

    @property
    def dsn(self):
        # type: () -> Optional[str]
        return None

    def should_send_default_pii(self):
        # type: () -> bool
        return False

    def is_active(self):
        # type: () -> bool
        """
        .. versionadded:: 2.0.0

        Returns whether the client is active (able to send data to Sentry).
        """
        return False

    def capture_event(self, *args, **kwargs):
        # type: (*Any, **Any) -> Optional[str]
        return None

    def capture_session(self, *args, **kwargs):
        # type: (*Any, **Any) -> None
        return None

    if TYPE_CHECKING:

        @overload
        def get_integration(self, name_or_class):
            # type: (str) -> Optional[Integration]
            ...

        @overload
        def get_integration(self, name_or_class):
            # type: (type[I]) -> Optional[I]
            ...

    def get_integration(self, name_or_class):
        # type: (Union[str, type[Integration]]) -> Optional[Integration]
        return None

    def close(self, *args, **kwargs):
        # type: (*Any, **Any) -> None
        return None

    def flush(self, *args, **kwargs):
        # type: (*Any, **Any) -> None
        return None

    def __enter__(self):
        # type: () -> BaseClient
        return self

    def __exit__(self, exc_type, exc_value, tb):
        # type: (Any, Any, Any) -> None
        return None


class NonRecordingClient(BaseClient):
    """
    .. versionadded:: 2.0.0

    A client that does not send any events to Sentry. This is used as a fallback when the Sentry SDK is not yet initialized.
    """

    pass


class _Client(BaseClient):
    """
    The client is internally responsible for capturing the events and
    forwarding them to Sentry through the configured transport. It takes
    the client options as keyword arguments and optionally the DSN as first
    argument.

    Alias of :py:class:`sentry_sdk.Client`. (Was created for better intellisense support.)
    """

    def __init__(self, *args, **kwargs):
        # type: (*Any, **Any) -> None
        super(_Client, self).__init__(options=get_options(*args, **kwargs))
        self._init_impl()

    def __getstate__(self):
        # type: () -> Any
        return {"options": self.options}

    def __setstate__(self, state):
        # type: (Any) -> None
        self.options = state["options"]
        self._init_impl()

    def _setup_instrumentation(self, functions_to_trace):
        # type: (Sequence[Dict[str, str]]) -> None
        """
        Instruments the functions given in the list `functions_to_trace` with the `@sentry_sdk.tracing.trace` decorator.
        """
        for function in functions_to_trace:
            class_name = None
            function_qualname = function["qualified_name"]
            module_name, function_name = function_qualname.rsplit(".", 1)

            try:
                # Try to import module and function
                # ex: "mymodule.submodule.funcname"

                module_obj = import_module(module_name)
                function_obj = getattr(module_obj, function_name)
                setattr(module_obj, function_name, trace(function_obj))
                logger.debug("Enabled tracing for %s", function_qualname)
            except module_not_found_error:
                try:
                    # Try to import a class
                    # ex: "mymodule.submodule.MyClassName.member_function"

                    module_name, class_name = module_name.rsplit(".", 1)
                    module_obj = import_module(module_name)
                    class_obj = getattr(module_obj, class_name)
                    function_obj = getattr(class_obj, function_name)
                    function_type = type(class_obj.__dict__[function_name])
                    traced_function = trace(function_obj)

                    if function_type in (staticmethod, classmethod):
                        traced_function = staticmethod(traced_function)

                    setattr(class_obj, function_name, traced_function)
                    setattr(module_obj, class_name, class_obj)
                    logger.debug("Enabled tracing for %s", function_qualname)

                except Exception as e:
                    logger.warning(
                        "Can not enable tracing for '%s'. (%s) Please check your `functions_to_trace` parameter.",
                        function_qualname,
                        e,
                    )

            except Exception as e:
                logger.warning(
                    "Can not enable tracing for '%s'. (%s) Please check your `functions_to_trace` parameter.",
                    function_qualname,
                    e,
                )

    def _init_impl(self):
        # type: () -> None
        old_debug = _client_init_debug.get(False)

        def _capture_envelope(envelope):
            # type: (Envelope) -> None
            if self.transport is not None:
                self.transport.capture_envelope(envelope)

        try:
            _client_init_debug.set(self.options["debug"])
            self.transport = make_transport(self.options)

            self.monitor = None
            if self.transport:
                if self.options["enable_backpressure_handling"]:
                    self.monitor = Monitor(self.transport)

            self.session_flusher = SessionFlusher(capture_func=_capture_envelope)

            self.metrics_aggregator = None  # type: Optional[MetricsAggregator]
            experiments = self.options.get("_experiments", {})
            if experiments.get("enable_metrics", True):
                # Context vars are not working correctly on Python <=3.6
                # with gevent.
                metrics_supported = not is_gevent() or PY37
                if metrics_supported:
                    from sentry_sdk.metrics import MetricsAggregator

                    self.metrics_aggregator = MetricsAggregator(
                        capture_func=_capture_envelope,
                        enable_code_locations=bool(
                            experiments.get("metric_code_locations", True)
                        ),
                    )
                else:
                    logger.info(
                        "Metrics not supported on Python 3.6 and lower with gevent."
                    )

            max_request_body_size = ("always", "never", "small", "medium")
            if self.options["max_request_body_size"] not in max_request_body_size:
                raise ValueError(
                    "Invalid value for max_request_body_size. Must be one of {}".format(
                        max_request_body_size
                    )
                )

            if self.options["_experiments"].get("otel_powered_performance", False):
                logger.debug(
                    "[OTel] Enabling experimental OTel-powered performance monitoring."
                )
                self.options["instrumenter"] = INSTRUMENTER.OTEL
                if (
                    "sentry_sdk.integrations.opentelemetry.integration.OpenTelemetryIntegration"
                    not in _DEFAULT_INTEGRATIONS
                ):
                    _DEFAULT_INTEGRATIONS.append(
                        "sentry_sdk.integrations.opentelemetry.integration.OpenTelemetryIntegration",
                    )

            self.integrations = setup_integrations(
                self.options["integrations"],
                with_defaults=self.options["default_integrations"],
                with_auto_enabling_integrations=self.options[
                    "auto_enabling_integrations"
                ],
                disabled_integrations=self.options["disabled_integrations"],
            )

            spotlight_config = self.options.get("spotlight")
            if spotlight_config is None and "SENTRY_SPOTLIGHT" in os.environ:
                spotlight_env_value = os.environ["SENTRY_SPOTLIGHT"]
                spotlight_config = env_to_bool(spotlight_env_value, strict=True)
                self.options["spotlight"] = (
                    spotlight_config
                    if spotlight_config is not None
                    else spotlight_env_value
                )

            if self.options.get("spotlight"):
                self.spotlight = setup_spotlight(self.options)

            sdk_name = get_sdk_name(list(self.integrations.keys()))
            SDK_INFO["name"] = sdk_name
            logger.debug("Setting SDK name to '%s'", sdk_name)

            if has_profiling_enabled(self.options):
                try:
                    setup_profiler(self.options)
                except Exception as e:
                    logger.debug("Can not set up profiler. (%s)", e)
            else:
                try:
                    setup_continuous_profiler(
                        self.options,
                        sdk_info=SDK_INFO,
                        capture_func=_capture_envelope,
                    )
                except Exception as e:
                    logger.debug("Can not set up continuous profiler. (%s)", e)

        finally:
            _client_init_debug.set(old_debug)

        self._setup_instrumentation(self.options.get("functions_to_trace", []))

        if (
            self.monitor
            or self.metrics_aggregator
            or has_profiling_enabled(self.options)
            or isinstance(self.transport, BaseHttpTransport)
        ):
            # If we have anything that could spawn a background thread, we
            # need to check whether it is safe to use one.
            check_uwsgi_thread_support()

    def is_active(self):
        # type: () -> bool
        """
        .. versionadded:: 2.0.0

        Returns whether the client is active (able to send data to Sentry).
        """
        return True

    def should_send_default_pii(self):
        # type: () -> bool
        """
        .. versionadded:: 2.0.0

        Returns whether the client should send default PII (Personally Identifiable Information) data to Sentry.
        """
        result = self.options.get("send_default_pii")
        if result is None:
            result = not self.options["dsn"] and self.spotlight is not None

        return result

    @property
    def dsn(self):
        # type: () -> Optional[str]
        """Returns the configured DSN as string."""
        return self.options["dsn"]

    def _prepare_event(
        self,
        event,  # type: Event
        hint,  # type: Hint
        scope,  # type: Optional[Scope]
    ):
        # type: (...) -> Optional[Event]

        previous_total_spans = None  # type: Optional[int]

        if event.get("timestamp") is None:
            event["timestamp"] = datetime.now(timezone.utc)

        if scope is not None:
            is_transaction = event.get("type") == "transaction"
            spans_before = len(cast(List[Dict[str, object]], event.get("spans", [])))
            event_ = scope.apply_to_event(event, hint, self.options)

            # one of the event/error processors returned None
            if event_ is None:
                if self.transport:
                    self.transport.record_lost_event(
                        "event_processor",
                        data_category=("transaction" if is_transaction else "error"),
                    )
                    if is_transaction:
                        self.transport.record_lost_event(
                            "event_processor",
                            data_category="span",
                            quantity=spans_before + 1,  # +1 for the transaction itself
                        )
                return None

            event = event_
            spans_delta = spans_before - len(
                cast(List[Dict[str, object]], event.get("spans", []))
            )
            if is_transaction and spans_delta > 0 and self.transport is not None:
                self.transport.record_lost_event(
                    "event_processor", data_category="span", quantity=spans_delta
                )

            dropped_spans = event.pop("_dropped_spans", 0) + spans_delta  # type: int
            if dropped_spans > 0:
                previous_total_spans = spans_before + dropped_spans

        if (
            self.options["attach_stacktrace"]
            and "exception" not in event
            and "stacktrace" not in event
            and "threads" not in event
        ):
            with capture_internal_exceptions():
                event["threads"] = {
                    "values": [
                        {
                            "stacktrace": current_stacktrace(
                                include_local_variables=self.options.get(
                                    "include_local_variables", True
                                ),
                                max_value_length=self.options.get(
                                    "max_value_length", DEFAULT_MAX_VALUE_LENGTH
                                ),
                            ),
                            "crashed": False,
                            "current": True,
                        }
                    ]
                }

        for key in "release", "environment", "server_name", "dist":
            if event.get(key) is None and self.options[key] is not None:
                event[key] = str(self.options[key]).strip()
        if event.get("sdk") is None:
            sdk_info = dict(SDK_INFO)
            sdk_info["integrations"] = sorted(self.integrations.keys())
            event["sdk"] = sdk_info

        if event.get("platform") is None:
            event["platform"] = "python"

        event = handle_in_app(
            event,
            self.options["in_app_exclude"],
            self.options["in_app_include"],
            self.options["project_root"],
        )

        if event is not None:
            event_scrubber = self.options["event_scrubber"]
            if event_scrubber:
                event_scrubber.scrub_event(event)

        if previous_total_spans is not None:
            event["spans"] = AnnotatedValue(
                event.get("spans", []), {"len": previous_total_spans}
            )

        # Postprocess the event here so that annotated types do
        # generally not surface in before_send
        if event is not None:
            event = cast(
                "Event",
                serialize(
                    cast("Dict[str, Any]", event),
                    max_request_body_size=self.options.get("max_request_body_size"),
                    max_value_length=self.options.get("max_value_length"),
                    custom_repr=self.options.get("custom_repr"),
                ),
            )

        before_send = self.options["before_send"]
        if (
            before_send is not None
            and event is not None
            and event.get("type") != "transaction"
        ):
            new_event = None
            with capture_internal_exceptions():
                new_event = before_send(event, hint or {})
            if new_event is None:
                logger.info("before send dropped event")
                if self.transport:
                    self.transport.record_lost_event(
                        "before_send", data_category="error"
                    )
            event = new_event

        before_send_transaction = self.options["before_send_transaction"]
        if (
            before_send_transaction is not None
            and event is not None
            and event.get("type") == "transaction"
        ):
            new_event = None
            spans_before = len(cast(List[Dict[str, object]], event.get("spans", [])))
            with capture_internal_exceptions():
                new_event = before_send_transaction(event, hint or {})
            if new_event is None:
                logger.info("before send transaction dropped event")
                if self.transport:
                    self.transport.record_lost_event(
                        reason="before_send", data_category="transaction"
                    )
                    self.transport.record_lost_event(
                        reason="before_send",
                        data_category="span",
                        quantity=spans_before + 1,  # +1 for the transaction itself
                    )
            else:
                spans_delta = spans_before - len(new_event.get("spans", []))
                if spans_delta > 0 and self.transport is not None:
                    self.transport.record_lost_event(
                        reason="before_send", data_category="span", quantity=spans_delta
                    )

            event = new_event

        return event

    def _is_ignored_error(self, event, hint):
        # type: (Event, Hint) -> bool
        exc_info = hint.get("exc_info")
        if exc_info is None:
            return False

        error = exc_info[0]
        error_type_name = get_type_name(exc_info[0])
        error_full_name = "%s.%s" % (exc_info[0].__module__, error_type_name)

        for ignored_error in self.options["ignore_errors"]:
            # String types are matched against the type name in the
            # exception only
            if isinstance(ignored_error, str):
                if ignored_error == error_full_name or ignored_error == error_type_name:
                    return True
            else:
                if issubclass(error, ignored_error):
                    return True

        return False

def _should_capture(
|
||||
self,
|
||||
event, # type: Event
|
||||
hint, # type: Hint
|
||||
scope=None, # type: Optional[Scope]
|
||||
):
|
||||
# type: (...) -> bool
|
||||
# Transactions are sampled independent of error events.
|
||||
is_transaction = event.get("type") == "transaction"
|
||||
if is_transaction:
|
||||
return True
|
||||
|
||||
ignoring_prevents_recursion = scope is not None and not scope._should_capture
|
||||
if ignoring_prevents_recursion:
|
||||
return False
|
||||
|
||||
ignored_by_config_option = self._is_ignored_error(event, hint)
|
||||
if ignored_by_config_option:
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
def _should_sample_error(
|
||||
self,
|
||||
event, # type: Event
|
||||
hint, # type: Hint
|
||||
):
|
||||
# type: (...) -> bool
|
||||
error_sampler = self.options.get("error_sampler", None)
|
||||
|
||||
if callable(error_sampler):
|
||||
with capture_internal_exceptions():
|
||||
sample_rate = error_sampler(event, hint)
|
||||
else:
|
||||
sample_rate = self.options["sample_rate"]
|
||||
|
||||
try:
|
||||
not_in_sample_rate = sample_rate < 1.0 and random.random() >= sample_rate
|
||||
except NameError:
|
||||
logger.warning(
|
||||
"The provided error_sampler raised an error. Defaulting to sampling the event."
|
||||
)
|
||||
|
||||
# If the error_sampler raised an error, we should sample the event, since the default behavior
|
||||
# (when no sample_rate or error_sampler is provided) is to sample all events.
|
||||
not_in_sample_rate = False
|
||||
except TypeError:
|
||||
parameter, verb = (
|
||||
("error_sampler", "returned")
|
||||
if callable(error_sampler)
|
||||
else ("sample_rate", "contains")
|
||||
)
|
||||
logger.warning(
|
||||
"The provided %s %s an invalid value of %s. The value should be a float or a bool. Defaulting to sampling the event."
|
||||
% (parameter, verb, repr(sample_rate))
|
||||
)
|
||||
|
||||
# If the sample_rate has an invalid value, we should sample the event, since the default behavior
|
||||
# (when no sample_rate or error_sampler is provided) is to sample all events.
|
||||
not_in_sample_rate = False
|
||||
|
||||
if not_in_sample_rate:
|
||||
# because we will not sample this event, record a "lost event".
|
||||
if self.transport:
|
||||
self.transport.record_lost_event("sample_rate", data_category="error")
|
||||
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
def _update_session_from_event(
|
||||
self,
|
||||
session, # type: Session
|
||||
event, # type: Event
|
||||
):
|
||||
# type: (...) -> None
|
||||
|
||||
crashed = False
|
||||
errored = False
|
||||
user_agent = None
|
||||
|
||||
exceptions = (event.get("exception") or {}).get("values")
|
||||
if exceptions:
|
||||
errored = True
|
||||
for error in exceptions:
|
||||
mechanism = error.get("mechanism")
|
||||
if isinstance(mechanism, Mapping) and mechanism.get("handled") is False:
|
||||
crashed = True
|
||||
break
|
||||
|
||||
user = event.get("user")
|
||||
|
||||
if session.user_agent is None:
|
||||
headers = (event.get("request") or {}).get("headers")
|
||||
headers_dict = headers if isinstance(headers, dict) else {}
|
||||
for k, v in headers_dict.items():
|
||||
if k.lower() == "user-agent":
|
||||
user_agent = v
|
||||
break
|
||||
|
||||
session.update(
|
||||
status="crashed" if crashed else None,
|
||||
user=user,
|
||||
user_agent=user_agent,
|
||||
errors=session.errors + (errored or crashed),
|
||||
)
|
||||
|
||||
def capture_event(
|
||||
self,
|
||||
event, # type: Event
|
||||
hint=None, # type: Optional[Hint]
|
||||
scope=None, # type: Optional[Scope]
|
||||
):
|
||||
# type: (...) -> Optional[str]
|
||||
"""Captures an event.
|
||||
|
||||
:param event: A ready-made event that can be directly sent to Sentry.
|
||||
|
||||
:param hint: Contains metadata about the event that can be read from `before_send`, such as the original exception object or a HTTP request object.
|
||||
|
||||
:param scope: An optional :py:class:`sentry_sdk.Scope` to apply to events.
|
||||
|
||||
:returns: An event ID. May be `None` if there is no DSN set or of if the SDK decided to discard the event for other reasons. In such situations setting `debug=True` on `init()` may help.
|
||||
"""
|
||||
hint = dict(hint or ()) # type: Hint
|
||||
|
||||
if not self._should_capture(event, hint, scope):
|
||||
return None
|
||||
|
||||
profile = event.pop("profile", None)
|
||||
|
||||
event_id = event.get("event_id")
|
||||
if event_id is None:
|
||||
event["event_id"] = event_id = uuid.uuid4().hex
|
||||
event_opt = self._prepare_event(event, hint, scope)
|
||||
if event_opt is None:
|
||||
return None
|
||||
|
||||
# whenever we capture an event we also check if the session needs
|
||||
# to be updated based on that information.
|
||||
session = scope._session if scope else None
|
||||
if session:
|
||||
self._update_session_from_event(session, event)
|
||||
|
||||
is_transaction = event_opt.get("type") == "transaction"
|
||||
is_checkin = event_opt.get("type") == "check_in"
|
||||
|
||||
if (
|
||||
not is_transaction
|
||||
and not is_checkin
|
||||
and not self._should_sample_error(event, hint)
|
||||
):
|
||||
return None
|
||||
|
||||
attachments = hint.get("attachments")
|
||||
|
||||
trace_context = event_opt.get("contexts", {}).get("trace") or {}
|
||||
dynamic_sampling_context = trace_context.pop("dynamic_sampling_context", {})
|
||||
|
||||
headers = {
|
||||
"event_id": event_opt["event_id"],
|
||||
"sent_at": format_timestamp(datetime.now(timezone.utc)),
|
||||
} # type: dict[str, object]
|
||||
|
||||
if dynamic_sampling_context:
|
||||
headers["trace"] = dynamic_sampling_context
|
||||
|
||||
envelope = Envelope(headers=headers)
|
||||
|
||||
if is_transaction:
|
||||
if isinstance(profile, Profile):
|
||||
envelope.add_profile(profile.to_json(event_opt, self.options))
|
||||
envelope.add_transaction(event_opt)
|
||||
elif is_checkin:
|
||||
envelope.add_checkin(event_opt)
|
||||
else:
|
||||
envelope.add_event(event_opt)
|
||||
|
||||
for attachment in attachments or ():
|
||||
envelope.add_item(attachment.to_envelope_item())
|
||||
|
||||
return_value = None
|
||||
if self.spotlight:
|
||||
self.spotlight.capture_envelope(envelope)
|
||||
return_value = event_id
|
||||
|
||||
if self.transport is not None:
|
||||
self.transport.capture_envelope(envelope)
|
||||
return_value = event_id
|
||||
|
||||
return return_value
|
||||
|
||||
def capture_session(
|
||||
self, session # type: Session
|
||||
):
|
||||
# type: (...) -> None
|
||||
if not session.release:
|
||||
logger.info("Discarded session update because of missing release")
|
||||
else:
|
||||
self.session_flusher.add_session(session)
|
||||
|
||||
if TYPE_CHECKING:
|
||||
|
||||
@overload
|
||||
def get_integration(self, name_or_class):
|
||||
# type: (str) -> Optional[Integration]
|
||||
...
|
||||
|
||||
@overload
|
||||
def get_integration(self, name_or_class):
|
||||
# type: (type[I]) -> Optional[I]
|
||||
...
|
||||
|
||||
def get_integration(
|
||||
self, name_or_class # type: Union[str, Type[Integration]]
|
||||
):
|
||||
# type: (...) -> Optional[Integration]
|
||||
"""Returns the integration for this client by name or class.
|
||||
If the client does not have that integration then `None` is returned.
|
||||
"""
|
||||
if isinstance(name_or_class, str):
|
||||
integration_name = name_or_class
|
||||
elif name_or_class.identifier is not None:
|
||||
integration_name = name_or_class.identifier
|
||||
else:
|
||||
raise ValueError("Integration has no name")
|
||||
|
||||
return self.integrations.get(integration_name)
|
||||
|
||||
def close(
|
||||
self,
|
||||
timeout=None, # type: Optional[float]
|
||||
callback=None, # type: Optional[Callable[[int, float], None]]
|
||||
):
|
||||
# type: (...) -> None
|
||||
"""
|
||||
Close the client and shut down the transport. Arguments have the same
|
||||
semantics as :py:meth:`Client.flush`.
|
||||
"""
|
||||
if self.transport is not None:
|
||||
self.flush(timeout=timeout, callback=callback)
|
||||
self.session_flusher.kill()
|
||||
if self.metrics_aggregator is not None:
|
||||
self.metrics_aggregator.kill()
|
||||
if self.monitor:
|
||||
self.monitor.kill()
|
||||
self.transport.kill()
|
||||
self.transport = None
|
||||
|
||||
def flush(
|
||||
self,
|
||||
timeout=None, # type: Optional[float]
|
||||
callback=None, # type: Optional[Callable[[int, float], None]]
|
||||
):
|
||||
# type: (...) -> None
|
||||
"""
|
||||
Wait for the current events to be sent.
|
||||
|
||||
:param timeout: Wait for at most `timeout` seconds. If no `timeout` is provided, the `shutdown_timeout` option value is used.
|
||||
|
||||
:param callback: Is invoked with the number of pending events and the configured timeout.
|
||||
"""
|
||||
if self.transport is not None:
|
||||
if timeout is None:
|
||||
timeout = self.options["shutdown_timeout"]
|
||||
self.session_flusher.flush()
|
||||
if self.metrics_aggregator is not None:
|
||||
self.metrics_aggregator.flush()
|
||||
self.transport.flush(timeout=timeout, callback=callback)
|
||||
|
||||
def __enter__(self):
|
||||
# type: () -> _Client
|
||||
return self
|
||||
|
||||
def __exit__(self, exc_type, exc_value, tb):
|
||||
# type: (Any, Any, Any) -> None
|
||||
self.close()
|
||||
|
||||
|
||||
from typing import TYPE_CHECKING
|
||||
|
||||
if TYPE_CHECKING:
|
||||
# Make mypy, PyCharm and other static analyzers think `get_options` is a
|
||||
# type to have nicer autocompletion for params.
|
||||
#
|
||||
# Use `ClientConstructor` to define the argument types of `init` and
|
||||
# `Dict[str, Any]` to tell static analyzers about the return type.
|
||||
|
||||
class get_options(ClientConstructor, Dict[str, Any]): # noqa: N801
|
||||
pass
|
||||
|
||||
class Client(ClientConstructor, _Client):
|
||||
pass
|
||||
|
||||
else:
|
||||
# Alias `get_options` for actual usage. Go through the lambda indirection
|
||||
# to throw PyCharm off of the weakly typed signature (it would otherwise
|
||||
# discover both the weakly typed signature of `_init` and our faked `init`
|
||||
# type).
|
||||
|
||||
get_options = (lambda: _get_options)()
|
||||
Client = (lambda: _Client)()
|
@@ -0,0 +1,587 @@
import itertools

from enum import Enum
from typing import TYPE_CHECKING

# up top to prevent circular import due to integration import
DEFAULT_MAX_VALUE_LENGTH = 1024

DEFAULT_MAX_STACK_FRAMES = 100
DEFAULT_ADD_FULL_STACK = False


# Also needs to be at the top to prevent circular import
class EndpointType(Enum):
    """
    The type of an endpoint. This is an enum, rather than a constant, for historical reasons
    (the old /store endpoint). The enum also preserves future compatibility, in case we ever
    have a new endpoint.
    """

    ENVELOPE = "envelope"


class CompressionAlgo(Enum):
    GZIP = "gzip"
    BROTLI = "br"


if TYPE_CHECKING:
    import sentry_sdk

    from typing import Optional
    from typing import Callable
    from typing import Union
    from typing import List
    from typing import Type
    from typing import Dict
    from typing import Any
    from typing import Sequence
    from typing import Tuple
    from typing_extensions import Literal
    from typing_extensions import TypedDict

    from sentry_sdk._types import (
        BreadcrumbProcessor,
        ContinuousProfilerMode,
        Event,
        EventProcessor,
        Hint,
        MeasurementUnit,
        ProfilerMode,
        TracesSampler,
        TransactionProcessor,
        MetricTags,
        MetricValue,
    )

    # Experiments are feature flags to enable and disable certain unstable SDK
    # functionality. Changing them from the defaults (`None`) in production
    # code is highly discouraged. They are not subject to any stability
    # guarantees such as the ones from semantic versioning.
    Experiments = TypedDict(
        "Experiments",
        {
            "max_spans": Optional[int],
            "max_flags": Optional[int],
            "record_sql_params": Optional[bool],
            "continuous_profiling_auto_start": Optional[bool],
            "continuous_profiling_mode": Optional[ContinuousProfilerMode],
            "otel_powered_performance": Optional[bool],
            "transport_zlib_compression_level": Optional[int],
            "transport_compression_level": Optional[int],
            "transport_compression_algo": Optional[CompressionAlgo],
            "transport_num_pools": Optional[int],
            "transport_http2": Optional[bool],
            "enable_metrics": Optional[bool],
            "before_emit_metric": Optional[
                Callable[[str, MetricValue, MeasurementUnit, MetricTags], bool]
            ],
            "metric_code_locations": Optional[bool],
        },
        total=False,
    )

DEFAULT_QUEUE_SIZE = 100
DEFAULT_MAX_BREADCRUMBS = 100
MATCH_ALL = r".*"

FALSE_VALUES = [
    "false",
    "no",
    "off",
    "n",
    "0",
]
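

# Illustrative sketch (not part of the SDK): FALSE_VALUES is intended for
# case-insensitive parsing of boolean-ish environment variables. A minimal
# parser along these lines could consume such a list; the helper name
# `_parse_bool_env` and its inlined default list are hypothetical:
def _parse_bool_env(value, false_values=("false", "no", "off", "n", "0")):
    # type: (Optional[str], Sequence[str]) -> bool
    # Treat None and any FALSE_VALUES-style entry as False, anything else as True.
    if value is None:
        return False
    return value.strip().lower() not in false_values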


class INSTRUMENTER:
    SENTRY = "sentry"
    OTEL = "otel"


class SPANDATA:
    """
    Additional information describing the type of the span.
    See: https://develop.sentry.dev/sdk/performance/span-data-conventions/
    """

    AI_FREQUENCY_PENALTY = "ai.frequency_penalty"
    """
    Used to reduce repetitiveness of generated tokens.
    Example: 0.5
    """

    AI_PRESENCE_PENALTY = "ai.presence_penalty"
    """
    Used to reduce repetitiveness of generated tokens.
    Example: 0.5
    """

    AI_INPUT_MESSAGES = "ai.input_messages"
    """
    The input messages to an LLM call.
    Example: [{"role": "user", "message": "hello"}]
    """

    AI_MODEL_ID = "ai.model_id"
    """
    The unique descriptor of the model being executed.
    Example: gpt-4
    """

    AI_METADATA = "ai.metadata"
    """
    Extra metadata passed to an AI pipeline step.
    Example: {"executed_function": "add_integers"}
    """

    AI_TAGS = "ai.tags"
    """
    Tags that describe an AI pipeline step.
    Example: {"executed_function": "add_integers"}
    """

    AI_STREAMING = "ai.streaming"
    """
    Whether or not the AI model call's response was streamed back asynchronously.
    Example: true
    """

    AI_TEMPERATURE = "ai.temperature"
    """
    For an AI model call, the temperature parameter. Temperature essentially means how random the output will be.
    Example: 0.5
    """

    AI_TOP_P = "ai.top_p"
    """
    For an AI model call, the top_p parameter. Top_p essentially controls how random the output will be.
    Example: 0.5
    """

    AI_TOP_K = "ai.top_k"
    """
    For an AI model call, the top_k parameter. Top_k essentially controls how random the output will be.
    Example: 35
    """

    AI_FUNCTION_CALL = "ai.function_call"
    """
    For an AI model call, the function that was called. This is deprecated for OpenAI, and replaced by tool_calls
    """

    AI_TOOL_CALLS = "ai.tool_calls"
    """
    For an AI model call, the tool calls that were made (OpenAI's replacement for the deprecated function_call)
    """

    AI_TOOLS = "ai.tools"
    """
    For an AI model call, the functions that are available
    """

    AI_RESPONSE_FORMAT = "ai.response_format"
    """
    For an AI model call, the format of the response
    """

    AI_LOGIT_BIAS = "ai.logit_bias"
    """
    For an AI model call, the logit bias
    """

    AI_PREAMBLE = "ai.preamble"
    """
    For an AI model call, the preamble parameter.
    Preambles are a part of the prompt used to adjust the model's overall behavior and conversation style.
    Example: "You are now a clown."
    """

    AI_RAW_PROMPTING = "ai.raw_prompting"
    """
    Minimize pre-processing done to the prompt sent to the LLM.
    Example: true
    """

    AI_RESPONSES = "ai.responses"
    """
    The responses to an AI model call. Always as a list.
    Example: ["hello", "world"]
    """

    AI_SEED = "ai.seed"
    """
    The seed; ideally, models given the same seed and the same other parameters will produce the exact same output.
    Example: 123.45
    """

    DB_NAME = "db.name"
    """
    The name of the database being accessed. For commands that switch the database, this should be set to the target database (even if the command fails).
    Example: myDatabase
    """

    DB_USER = "db.user"
    """
    The name of the database user used for connecting to the database.
    See: https://github.com/open-telemetry/opentelemetry-specification/blob/main/specification/trace/semantic_conventions/database.md
    Example: my_user
    """

    DB_OPERATION = "db.operation"
    """
    The name of the operation being executed, e.g. the MongoDB command name such as findAndModify, or the SQL keyword.
    See: https://github.com/open-telemetry/opentelemetry-specification/blob/main/specification/trace/semantic_conventions/database.md
    Example: findAndModify, HMSET, SELECT
    """

    DB_SYSTEM = "db.system"
    """
    An identifier for the database management system (DBMS) product being used.
    See: https://github.com/open-telemetry/opentelemetry-specification/blob/main/specification/trace/semantic_conventions/database.md
    Example: postgresql
    """

    DB_MONGODB_COLLECTION = "db.mongodb.collection"
    """
    The MongoDB collection being accessed within the database.
    See: https://github.com/open-telemetry/semantic-conventions/blob/main/docs/database/mongodb.md#attributes
    Example: public.users; customers
    """

    CACHE_HIT = "cache.hit"
    """
    A boolean indicating whether the requested data was found in the cache.
    Example: true
    """

    CACHE_ITEM_SIZE = "cache.item_size"
    """
    The size of the requested data in bytes.
    Example: 58
    """

    CACHE_KEY = "cache.key"
    """
    The key of the requested data.
    Example: template.cache.some_item.867da7e2af8e6b2f3aa7213a4080edb3
    """

    NETWORK_PEER_ADDRESS = "network.peer.address"
    """
    Peer address of the network connection - IP address or Unix domain socket name.
    Example: 10.1.2.80, /tmp/my.sock, localhost
    """

    NETWORK_PEER_PORT = "network.peer.port"
    """
    Peer port number of the network connection.
    Example: 6379
    """

    HTTP_QUERY = "http.query"
    """
    The query string present in the URL.
    Example: ?foo=bar&bar=baz
    """

    HTTP_FRAGMENT = "http.fragment"
    """
    The fragment present in the URL.
    Example: #foo=bar
    """

    HTTP_METHOD = "http.method"
    """
    The HTTP method used.
    Example: GET
    """

    HTTP_STATUS_CODE = "http.response.status_code"
    """
    The HTTP status code as an integer.
    Example: 418
    """

    MESSAGING_DESTINATION_NAME = "messaging.destination.name"
    """
    The destination name where the message is being consumed from,
    e.g. the queue name or topic.
    """

    MESSAGING_MESSAGE_ID = "messaging.message.id"
    """
    The message's identifier.
    """

    MESSAGING_MESSAGE_RETRY_COUNT = "messaging.message.retry.count"
    """
    Number of retries/attempts to process a message.
    """

    MESSAGING_MESSAGE_RECEIVE_LATENCY = "messaging.message.receive.latency"
    """
    The latency between when the task was enqueued and when it was started to be processed.
    """

    MESSAGING_SYSTEM = "messaging.system"
    """
    The messaging system's name, e.g. `kafka`, `aws_sqs`
    """

    SERVER_ADDRESS = "server.address"
    """
    Name of the database host.
    Example: example.com
    """

    SERVER_PORT = "server.port"
    """
    Logical server port number.
    Example: 80; 8080; 443
    """

    SERVER_SOCKET_ADDRESS = "server.socket.address"
    """
    Physical server IP address or Unix socket address.
    Example: 10.5.3.2
    """

    SERVER_SOCKET_PORT = "server.socket.port"
    """
    Physical server port.
    Recommended: If different than server.port.
    Example: 16456
    """

    CODE_FILEPATH = "code.filepath"
    """
    The source code file name that identifies the code unit as uniquely as possible (preferably an absolute file path).
    Example: "/app/myapplication/http/handler/server.py"
    """

    CODE_LINENO = "code.lineno"
    """
    The line number in `code.filepath` best representing the operation. It SHOULD point within the code unit named in `code.function`.
    Example: 42
    """

    CODE_FUNCTION = "code.function"
    """
    The method or function name, or equivalent (usually rightmost part of the code unit's name).
    Example: "server_request"
    """

    CODE_NAMESPACE = "code.namespace"
    """
    The "namespace" within which `code.function` is defined. Usually the qualified class or module name, such that `code.namespace` + some separator + `code.function` form a unique identifier for the code unit.
    Example: "http.handler"
    """

    THREAD_ID = "thread.id"
    """
    Identifier of a thread from where the span originated. This should be a string.
    Example: "7972576320"
    """

    THREAD_NAME = "thread.name"
    """
    Label identifying a thread from where the span originated. This should be a string.
    Example: "MainThread"
    """

    PROFILER_ID = "profiler_id"
    """
    Label identifying the profiler id that the span occurred in. This should be a string.
    Example: "5249fbada8d5416482c2f6e47e337372"
    """


class SPANSTATUS:
    """
    The status of a Sentry span.

    See: https://develop.sentry.dev/sdk/event-payloads/contexts/#trace-context
    """

    ABORTED = "aborted"
    ALREADY_EXISTS = "already_exists"
    CANCELLED = "cancelled"
    DATA_LOSS = "data_loss"
    DEADLINE_EXCEEDED = "deadline_exceeded"
    FAILED_PRECONDITION = "failed_precondition"
    INTERNAL_ERROR = "internal_error"
    INVALID_ARGUMENT = "invalid_argument"
    NOT_FOUND = "not_found"
    OK = "ok"
    OUT_OF_RANGE = "out_of_range"
    PERMISSION_DENIED = "permission_denied"
    RESOURCE_EXHAUSTED = "resource_exhausted"
    UNAUTHENTICATED = "unauthenticated"
    UNAVAILABLE = "unavailable"
    UNIMPLEMENTED = "unimplemented"
    UNKNOWN_ERROR = "unknown_error"


class OP:
    ANTHROPIC_MESSAGES_CREATE = "ai.messages.create.anthropic"
    CACHE_GET = "cache.get"
    CACHE_PUT = "cache.put"
    COHERE_CHAT_COMPLETIONS_CREATE = "ai.chat_completions.create.cohere"
    COHERE_EMBEDDINGS_CREATE = "ai.embeddings.create.cohere"
    DB = "db"
    DB_REDIS = "db.redis"
    EVENT_DJANGO = "event.django"
    FUNCTION = "function"
    FUNCTION_AWS = "function.aws"
    FUNCTION_GCP = "function.gcp"
    GRAPHQL_EXECUTE = "graphql.execute"
    GRAPHQL_MUTATION = "graphql.mutation"
    GRAPHQL_PARSE = "graphql.parse"
    GRAPHQL_RESOLVE = "graphql.resolve"
    GRAPHQL_SUBSCRIPTION = "graphql.subscription"
    GRAPHQL_QUERY = "graphql.query"
    GRAPHQL_VALIDATE = "graphql.validate"
    GRPC_CLIENT = "grpc.client"
    GRPC_SERVER = "grpc.server"
    HTTP_CLIENT = "http.client"
    HTTP_CLIENT_STREAM = "http.client.stream"
    HTTP_SERVER = "http.server"
    MIDDLEWARE_DJANGO = "middleware.django"
    MIDDLEWARE_LITESTAR = "middleware.litestar"
    MIDDLEWARE_LITESTAR_RECEIVE = "middleware.litestar.receive"
    MIDDLEWARE_LITESTAR_SEND = "middleware.litestar.send"
    MIDDLEWARE_STARLETTE = "middleware.starlette"
    MIDDLEWARE_STARLETTE_RECEIVE = "middleware.starlette.receive"
    MIDDLEWARE_STARLETTE_SEND = "middleware.starlette.send"
    MIDDLEWARE_STARLITE = "middleware.starlite"
    MIDDLEWARE_STARLITE_RECEIVE = "middleware.starlite.receive"
    MIDDLEWARE_STARLITE_SEND = "middleware.starlite.send"
    OPENAI_CHAT_COMPLETIONS_CREATE = "ai.chat_completions.create.openai"
    OPENAI_EMBEDDINGS_CREATE = "ai.embeddings.create.openai"
    HUGGINGFACE_HUB_CHAT_COMPLETIONS_CREATE = (
        "ai.chat_completions.create.huggingface_hub"
    )
    LANGCHAIN_PIPELINE = "ai.pipeline.langchain"
    LANGCHAIN_RUN = "ai.run.langchain"
    LANGCHAIN_TOOL = "ai.tool.langchain"
    LANGCHAIN_AGENT = "ai.agent.langchain"
    LANGCHAIN_CHAT_COMPLETIONS_CREATE = "ai.chat_completions.create.langchain"
    QUEUE_PROCESS = "queue.process"
    QUEUE_PUBLISH = "queue.publish"
    QUEUE_SUBMIT_ARQ = "queue.submit.arq"
    QUEUE_TASK_ARQ = "queue.task.arq"
    QUEUE_SUBMIT_CELERY = "queue.submit.celery"
    QUEUE_TASK_CELERY = "queue.task.celery"
    QUEUE_TASK_RQ = "queue.task.rq"
    QUEUE_SUBMIT_HUEY = "queue.submit.huey"
    QUEUE_TASK_HUEY = "queue.task.huey"
    QUEUE_SUBMIT_RAY = "queue.submit.ray"
    QUEUE_TASK_RAY = "queue.task.ray"
    SUBPROCESS = "subprocess"
    SUBPROCESS_WAIT = "subprocess.wait"
    SUBPROCESS_COMMUNICATE = "subprocess.communicate"
    TEMPLATE_RENDER = "template.render"
    VIEW_RENDER = "view.render"
    VIEW_RESPONSE_RENDER = "view.response.render"
    WEBSOCKET_SERVER = "websocket.server"
    SOCKET_CONNECTION = "socket.connection"
    SOCKET_DNS = "socket.dns"


# This type exists to trick mypy and PyCharm into thinking `init` and `Client`
# take these arguments (even though they take opaque **kwargs)
class ClientConstructor:

    def __init__(
        self,
        dsn=None,  # type: Optional[str]
        *,
        max_breadcrumbs=DEFAULT_MAX_BREADCRUMBS,  # type: int
        release=None,  # type: Optional[str]
        environment=None,  # type: Optional[str]
        server_name=None,  # type: Optional[str]
        shutdown_timeout=2,  # type: float
        integrations=[],  # type: Sequence[sentry_sdk.integrations.Integration]  # noqa: B006
        in_app_include=[],  # type: List[str]  # noqa: B006
        in_app_exclude=[],  # type: List[str]  # noqa: B006
        default_integrations=True,  # type: bool
        dist=None,  # type: Optional[str]
        transport=None,  # type: Optional[Union[sentry_sdk.transport.Transport, Type[sentry_sdk.transport.Transport], Callable[[Event], None]]]
        transport_queue_size=DEFAULT_QUEUE_SIZE,  # type: int
        sample_rate=1.0,  # type: float
        send_default_pii=None,  # type: Optional[bool]
        http_proxy=None,  # type: Optional[str]
        https_proxy=None,  # type: Optional[str]
        ignore_errors=[],  # type: Sequence[Union[type, str]]  # noqa: B006
        max_request_body_size="medium",  # type: str
        socket_options=None,  # type: Optional[List[Tuple[int, int, int | bytes]]]
        keep_alive=False,  # type: bool
        before_send=None,  # type: Optional[EventProcessor]
        before_breadcrumb=None,  # type: Optional[BreadcrumbProcessor]
        debug=None,  # type: Optional[bool]
        attach_stacktrace=False,  # type: bool
        ca_certs=None,  # type: Optional[str]
        propagate_traces=True,  # type: bool
        traces_sample_rate=None,  # type: Optional[float]
        traces_sampler=None,  # type: Optional[TracesSampler]
        profiles_sample_rate=None,  # type: Optional[float]
        profiles_sampler=None,  # type: Optional[TracesSampler]
        profiler_mode=None,  # type: Optional[ProfilerMode]
        profile_lifecycle="manual",  # type: Literal["manual", "trace"]
        profile_session_sample_rate=None,  # type: Optional[float]
        auto_enabling_integrations=True,  # type: bool
        disabled_integrations=None,  # type: Optional[Sequence[sentry_sdk.integrations.Integration]]
        auto_session_tracking=True,  # type: bool
        send_client_reports=True,  # type: bool
        _experiments={},  # type: Experiments  # noqa: B006
        proxy_headers=None,  # type: Optional[Dict[str, str]]
        instrumenter=INSTRUMENTER.SENTRY,  # type: Optional[str]
        before_send_transaction=None,  # type: Optional[TransactionProcessor]
        project_root=None,  # type: Optional[str]
        enable_tracing=None,  # type: Optional[bool]
        include_local_variables=True,  # type: Optional[bool]
        include_source_context=True,  # type: Optional[bool]
        trace_propagation_targets=[  # noqa: B006
            MATCH_ALL
        ],  # type: Optional[Sequence[str]]
        functions_to_trace=[],  # type: Sequence[Dict[str, str]]  # noqa: B006
        event_scrubber=None,  # type: Optional[sentry_sdk.scrubber.EventScrubber]
        max_value_length=DEFAULT_MAX_VALUE_LENGTH,  # type: int
        enable_backpressure_handling=True,  # type: bool
        error_sampler=None,  # type: Optional[Callable[[Event, Hint], Union[float, bool]]]
        enable_db_query_source=True,  # type: bool
        db_query_source_threshold_ms=100,  # type: int
        spotlight=None,  # type: Optional[Union[bool, str]]
        cert_file=None,  # type: Optional[str]
        key_file=None,  # type: Optional[str]
        custom_repr=None,  # type: Optional[Callable[..., Optional[str]]]
        add_full_stack=DEFAULT_ADD_FULL_STACK,  # type: bool
        max_stack_frames=DEFAULT_MAX_STACK_FRAMES,  # type: Optional[int]
    ):
        # type: (...) -> None
        pass


def _get_default_options():
    # type: () -> dict[str, Any]
    import inspect

    a = inspect.getfullargspec(ClientConstructor.__init__)
    defaults = a.defaults or ()
    kwonlydefaults = a.kwonlydefaults or {}

    return dict(
        itertools.chain(
            zip(a.args[-len(defaults) :], defaults),
            kwonlydefaults.items(),
        )
    )


DEFAULT_OPTIONS = _get_default_options()
del _get_default_options
|
||||
|
||||
|
||||
VERSION = "2.22.0"
|
||||
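The `_get_default_options` helper above derives the options dict purely from the constructor signature, so defaults are declared exactly once. A minimal self-contained sketch of the same technique (the `ToyConstructor` class and `get_defaults` name are hypothetical stand-ins, not SDK API):

```python
import inspect
import itertools


class ToyConstructor:
    # Hypothetical stand-in for ClientConstructor: the defaults live
    # only in the signature, never in a separate dict.
    def __init__(self, dsn=None, debug=False, *, sample_rate=1.0):
        pass


def get_defaults(ctor):
    # Pair each positional argument that has a default with that default,
    # then merge in the keyword-only defaults, as _get_default_options does.
    spec = inspect.getfullargspec(ctor.__init__)
    defaults = spec.defaults or ()
    kwonlydefaults = spec.kwonlydefaults or {}
    return dict(
        itertools.chain(
            zip(spec.args[-len(defaults):], defaults),
            kwonlydefaults.items(),
        )
    )


options = get_defaults(ToyConstructor)
# options == {"dsn": None, "debug": False, "sample_rate": 1.0}
```

The `spec.args[-len(defaults):]` slice works because `getfullargspec` guarantees that defaults align with the trailing positional arguments.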
@@ -0,0 +1,10 @@
from sentry_sdk.crons.api import capture_checkin
from sentry_sdk.crons.consts import MonitorStatus
from sentry_sdk.crons.decorator import monitor


__all__ = [
    "capture_checkin",
    "MonitorStatus",
    "monitor",
]
@@ -0,0 +1,57 @@
import uuid

import sentry_sdk

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from typing import Optional
    from sentry_sdk._types import Event, MonitorConfig


def _create_check_in_event(
    monitor_slug=None,  # type: Optional[str]
    check_in_id=None,  # type: Optional[str]
    status=None,  # type: Optional[str]
    duration_s=None,  # type: Optional[float]
    monitor_config=None,  # type: Optional[MonitorConfig]
):
    # type: (...) -> Event
    options = sentry_sdk.get_client().options
    check_in_id = check_in_id or uuid.uuid4().hex  # type: str

    check_in = {
        "type": "check_in",
        "monitor_slug": monitor_slug,
        "check_in_id": check_in_id,
        "status": status,
        "duration": duration_s,
        "environment": options.get("environment", None),
        "release": options.get("release", None),
    }  # type: Event

    if monitor_config:
        check_in["monitor_config"] = monitor_config

    return check_in


def capture_checkin(
    monitor_slug=None,  # type: Optional[str]
    check_in_id=None,  # type: Optional[str]
    status=None,  # type: Optional[str]
    duration=None,  # type: Optional[float]
    monitor_config=None,  # type: Optional[MonitorConfig]
):
    # type: (...) -> str
    check_in_event = _create_check_in_event(
        monitor_slug=monitor_slug,
        check_in_id=check_in_id,
        status=status,
        duration_s=duration,
        monitor_config=monitor_config,
    )

    sentry_sdk.capture_event(check_in_event)

    return check_in_event["check_in_id"]
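A check-in lifecycle is two events sharing one `check_in_id`: an opening `in_progress` event and a closing `ok`/`error` event. A self-contained sketch of the payload shape built by `_create_check_in_event` (the `options` dict here is a hypothetical stand-in for the client options object):

```python
import uuid


def create_check_in_event(monitor_slug, status, options, duration_s=None, check_in_id=None):
    # Mirrors _create_check_in_event: reuse the caller's ID for the
    # closing check-in, otherwise mint a fresh one.
    check_in_id = check_in_id or uuid.uuid4().hex
    return {
        "type": "check_in",
        "monitor_slug": monitor_slug,
        "check_in_id": check_in_id,
        "status": status,
        "duration": duration_s,
        "environment": options.get("environment"),
        "release": options.get("release"),
    }


opts = {"release": "1.0.0"}
opening = create_check_in_event("nightly-job", "in_progress", opts)
closing = create_check_in_event(
    "nightly-job", "ok", opts,
    duration_s=4.2, check_in_id=opening["check_in_id"],
)
```

Because the closing event reuses the opening event's ID, the server can match the pair and compute monitor health from the status and duration.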
@@ -0,0 +1,4 @@
class MonitorStatus:
    IN_PROGRESS = "in_progress"
    OK = "ok"
    ERROR = "error"
@@ -0,0 +1,135 @@
from functools import wraps
from inspect import iscoroutinefunction

from sentry_sdk.crons import capture_checkin
from sentry_sdk.crons.consts import MonitorStatus
from sentry_sdk.utils import now

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from collections.abc import Awaitable, Callable
    from types import TracebackType
    from typing import (
        Any,
        Optional,
        ParamSpec,
        Type,
        TypeVar,
        Union,
        cast,
        overload,
    )
    from sentry_sdk._types import MonitorConfig

    P = ParamSpec("P")
    R = TypeVar("R")


class monitor:  # noqa: N801
    """
    Decorator/context manager to capture check-in events for a monitor.

    Usage (as decorator):
    ```
    import sentry_sdk

    app = Celery()

    @app.task
    @sentry_sdk.monitor(monitor_slug='my-fancy-slug')
    def test(arg):
        print(arg)
    ```

    This does not have to be used with Celery, but if you do use it with Celery,
    put the `@sentry_sdk.monitor` decorator below Celery's `@app.task` decorator.

    Usage (as context manager):
    ```
    import sentry_sdk

    def test(arg):
        with sentry_sdk.monitor(monitor_slug='my-fancy-slug'):
            print(arg)
    ```
    """

    def __init__(self, monitor_slug=None, monitor_config=None):
        # type: (Optional[str], Optional[MonitorConfig]) -> None
        self.monitor_slug = monitor_slug
        self.monitor_config = monitor_config

    def __enter__(self):
        # type: () -> None
        self.start_timestamp = now()
        self.check_in_id = capture_checkin(
            monitor_slug=self.monitor_slug,
            status=MonitorStatus.IN_PROGRESS,
            monitor_config=self.monitor_config,
        )

    def __exit__(self, exc_type, exc_value, traceback):
        # type: (Optional[Type[BaseException]], Optional[BaseException], Optional[TracebackType]) -> None
        duration_s = now() - self.start_timestamp

        if exc_type is None and exc_value is None and traceback is None:
            status = MonitorStatus.OK
        else:
            status = MonitorStatus.ERROR

        capture_checkin(
            monitor_slug=self.monitor_slug,
            check_in_id=self.check_in_id,
            status=status,
            duration=duration_s,
            monitor_config=self.monitor_config,
        )

    if TYPE_CHECKING:

        @overload
        def __call__(self, fn):
            # type: (Callable[P, Awaitable[Any]]) -> Callable[P, Awaitable[Any]]
            # Unfortunately, mypy does not give us any reliable way to type check the
            # return value of an Awaitable (i.e. async function) for this overload,
            # since calling iscoroutinefunction narrows the type to Callable[P, Awaitable[Any]].
            ...

        @overload
        def __call__(self, fn):
            # type: (Callable[P, R]) -> Callable[P, R]
            ...

    def __call__(
        self,
        fn,  # type: Union[Callable[P, R], Callable[P, Awaitable[Any]]]
    ):
        # type: (...) -> Union[Callable[P, R], Callable[P, Awaitable[Any]]]
        if iscoroutinefunction(fn):
            return self._async_wrapper(fn)
        else:
            if TYPE_CHECKING:
                fn = cast("Callable[P, R]", fn)
            return self._sync_wrapper(fn)

    def _async_wrapper(self, fn):
        # type: (Callable[P, Awaitable[Any]]) -> Callable[P, Awaitable[Any]]
        @wraps(fn)
        async def inner(*args: "P.args", **kwargs: "P.kwargs"):
            # type: (...) -> R
            with self:
                return await fn(*args, **kwargs)

        return inner

    def _sync_wrapper(self, fn):
        # type: (Callable[P, R]) -> Callable[P, R]
        @wraps(fn)
        def inner(*args: "P.args", **kwargs: "P.kwargs"):
            # type: (...) -> R
            with self:
                return fn(*args, **kwargs)

        return inner
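The `monitor` class works as both decorator and context manager by routing all bookkeeping through `__enter__`/`__exit__` and picking a sync or async wrapper via `iscoroutinefunction`. A stripped-down sketch of that dispatch pattern (the `Tracker` class is illustrative, not SDK API):

```python
import asyncio
from functools import wraps
from inspect import iscoroutinefunction


class Tracker:
    """Records enter/exit events around sync and async callables alike."""

    def __init__(self):
        self.events = []

    def __enter__(self):
        self.events.append("enter")

    def __exit__(self, exc_type, exc_value, tb):
        # Mirror monitor.__exit__: any exception info means an error status.
        self.events.append("error" if exc_type else "ok")

    def __call__(self, fn):
        # Same dispatch as sentry_sdk.monitor: coroutine functions need an
        # async wrapper so the context spans the awaited call, not just
        # the creation of the coroutine object.
        if iscoroutinefunction(fn):
            @wraps(fn)
            async def async_inner(*args, **kwargs):
                with self:
                    return await fn(*args, **kwargs)
            return async_inner

        @wraps(fn)
        def sync_inner(*args, **kwargs):
            with self:
                return fn(*args, **kwargs)
        return sync_inner


tracker = Tracker()

@tracker
def add(a, b):
    return a + b

@tracker
async def mul(a, b):
    return a * b

result_sync = add(2, 3)                 # -> 5
result_async = asyncio.run(mul(2, 3))   # -> 6
```

Wrapping the plain `with self:` around `await fn(...)` is the key detail: if the sync wrapper were applied to a coroutine function, `__exit__` would fire before the task ever ran.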
@@ -0,0 +1,41 @@
import sys
import logging
import warnings

from sentry_sdk import get_client
from sentry_sdk.client import _client_init_debug
from sentry_sdk.utils import logger
from logging import LogRecord


class _DebugFilter(logging.Filter):
    def filter(self, record):
        # type: (LogRecord) -> bool
        if _client_init_debug.get(False):
            return True

        return get_client().options["debug"]


def init_debug_support():
    # type: () -> None
    if not logger.handlers:
        configure_logger()


def configure_logger():
    # type: () -> None
    _handler = logging.StreamHandler(sys.stderr)
    _handler.setFormatter(logging.Formatter(" [sentry] %(levelname)s: %(message)s"))
    logger.addHandler(_handler)
    logger.setLevel(logging.DEBUG)
    logger.addFilter(_DebugFilter())


def configure_debug_hub():
    # type: () -> None
    warnings.warn(
        "configure_debug_hub is deprecated. Please remove calls to it, as it is a no-op.",
        DeprecationWarning,
        stacklevel=2,
    )
@@ -0,0 +1,349 @@
import io
import json
import mimetypes

from sentry_sdk.session import Session
from sentry_sdk.utils import json_dumps, capture_internal_exceptions

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from typing import Any
    from typing import Optional
    from typing import Union
    from typing import Dict
    from typing import List
    from typing import Iterator

    from sentry_sdk._types import Event, EventDataCategory


def parse_json(data):
    # type: (Union[bytes, str]) -> Any
    # on some python 3 versions this needs to be bytes
    if isinstance(data, bytes):
        data = data.decode("utf-8", "replace")
    return json.loads(data)


class Envelope:
    """
    Represents a Sentry Envelope. The calling code is responsible for adhering to the constraints
    documented in the Sentry docs: https://develop.sentry.dev/sdk/envelopes/#data-model. In particular,
    each envelope may have at most one Item with type "event" or "transaction" (but not both).
    """

    def __init__(
        self,
        headers=None,  # type: Optional[Dict[str, Any]]
        items=None,  # type: Optional[List[Item]]
    ):
        # type: (...) -> None
        if headers is not None:
            headers = dict(headers)
        self.headers = headers or {}
        if items is None:
            items = []
        else:
            items = list(items)
        self.items = items

    @property
    def description(self):
        # type: (...) -> str
        return "envelope with %s items (%s)" % (
            len(self.items),
            ", ".join(x.data_category for x in self.items),
        )

    def add_event(
        self, event  # type: Event
    ):
        # type: (...) -> None
        self.add_item(Item(payload=PayloadRef(json=event), type="event"))

    def add_transaction(
        self, transaction  # type: Event
    ):
        # type: (...) -> None
        self.add_item(Item(payload=PayloadRef(json=transaction), type="transaction"))

    def add_profile(
        self, profile  # type: Any
    ):
        # type: (...) -> None
        self.add_item(Item(payload=PayloadRef(json=profile), type="profile"))

    def add_profile_chunk(
        self, profile_chunk  # type: Any
    ):
        # type: (...) -> None
        self.add_item(
            Item(payload=PayloadRef(json=profile_chunk), type="profile_chunk")
        )

    def add_checkin(
        self, checkin  # type: Any
    ):
        # type: (...) -> None
        self.add_item(Item(payload=PayloadRef(json=checkin), type="check_in"))

    def add_session(
        self, session  # type: Union[Session, Any]
    ):
        # type: (...) -> None
        if isinstance(session, Session):
            session = session.to_json()
        self.add_item(Item(payload=PayloadRef(json=session), type="session"))

    def add_sessions(
        self, sessions  # type: Any
    ):
        # type: (...) -> None
        self.add_item(Item(payload=PayloadRef(json=sessions), type="sessions"))

    def add_item(
        self, item  # type: Item
    ):
        # type: (...) -> None
        self.items.append(item)

    def get_event(self):
        # type: (...) -> Optional[Event]
        for item in self.items:
            event = item.get_event()
            if event is not None:
                return event
        return None

    def get_transaction_event(self):
        # type: (...) -> Optional[Event]
        for item in self.items:
            event = item.get_transaction_event()
            if event is not None:
                return event
        return None

    def __iter__(self):
        # type: (...) -> Iterator[Item]
        return iter(self.items)

    def serialize_into(
        self, f  # type: Any
    ):
        # type: (...) -> None
        f.write(json_dumps(self.headers))
        f.write(b"\n")
        for item in self.items:
            item.serialize_into(f)

    def serialize(self):
        # type: (...) -> bytes
        out = io.BytesIO()
        self.serialize_into(out)
        return out.getvalue()

    @classmethod
    def deserialize_from(
        cls, f  # type: Any
    ):
        # type: (...) -> Envelope
        headers = parse_json(f.readline())
        items = []
        while True:
            item = Item.deserialize_from(f)
            if item is None:
                break
            items.append(item)
        return cls(headers=headers, items=items)

    @classmethod
    def deserialize(
        cls, bytes  # type: bytes
    ):
        # type: (...) -> Envelope
        return cls.deserialize_from(io.BytesIO(bytes))

    def __repr__(self):
        # type: (...) -> str
        return "<Envelope headers=%r items=%r>" % (self.headers, self.items)


class PayloadRef:
    def __init__(
        self,
        bytes=None,  # type: Optional[bytes]
        path=None,  # type: Optional[Union[bytes, str]]
        json=None,  # type: Optional[Any]
    ):
        # type: (...) -> None
        self.json = json
        self.bytes = bytes
        self.path = path

    def get_bytes(self):
        # type: (...) -> bytes
        if self.bytes is None:
            if self.path is not None:
                with capture_internal_exceptions():
                    with open(self.path, "rb") as f:
                        self.bytes = f.read()
            elif self.json is not None:
                self.bytes = json_dumps(self.json)
        return self.bytes or b""

    @property
    def inferred_content_type(self):
        # type: (...) -> str
        if self.json is not None:
            return "application/json"
        elif self.path is not None:
            path = self.path
            if isinstance(path, bytes):
                path = path.decode("utf-8", "replace")
            ty = mimetypes.guess_type(path)[0]
            if ty:
                return ty
        return "application/octet-stream"

    def __repr__(self):
        # type: (...) -> str
        return "<Payload %r>" % (self.inferred_content_type,)


class Item:
    def __init__(
        self,
        payload,  # type: Union[bytes, str, PayloadRef]
        headers=None,  # type: Optional[Dict[str, Any]]
        type=None,  # type: Optional[str]
        content_type=None,  # type: Optional[str]
        filename=None,  # type: Optional[str]
    ):
        if headers is not None:
            headers = dict(headers)
        else:
            headers = {}
        self.headers = headers
        if isinstance(payload, bytes):
            payload = PayloadRef(bytes=payload)
        elif isinstance(payload, str):
            payload = PayloadRef(bytes=payload.encode("utf-8"))

        if filename is not None:
            headers["filename"] = filename
        if type is not None:
            headers["type"] = type
        if content_type is not None:
            headers["content_type"] = content_type
        elif "content_type" not in headers:
            headers["content_type"] = payload.inferred_content_type

        self.payload = payload

    def __repr__(self):
        # type: (...) -> str
        return "<Item headers=%r payload=%r data_category=%r>" % (
            self.headers,
            self.payload,
            self.data_category,
        )

    @property
    def type(self):
        # type: (...) -> Optional[str]
        return self.headers.get("type")

    @property
    def data_category(self):
        # type: (...) -> EventDataCategory
        ty = self.headers.get("type")
        if ty == "session" or ty == "sessions":
            return "session"
        elif ty == "attachment":
            return "attachment"
        elif ty == "transaction":
            return "transaction"
        elif ty == "event":
            return "error"
        elif ty == "client_report":
            return "internal"
        elif ty == "profile":
            return "profile"
        elif ty == "profile_chunk":
            return "profile_chunk"
        elif ty == "statsd":
            return "metric_bucket"
        elif ty == "check_in":
            return "monitor"
        else:
            return "default"

    def get_bytes(self):
        # type: (...) -> bytes
        return self.payload.get_bytes()

    def get_event(self):
        # type: (...) -> Optional[Event]
        """
        Returns an error event if there is one.
        """
        if self.type == "event" and self.payload.json is not None:
            return self.payload.json
        return None

    def get_transaction_event(self):
        # type: (...) -> Optional[Event]
        if self.type == "transaction" and self.payload.json is not None:
            return self.payload.json
        return None

    def serialize_into(
        self, f  # type: Any
    ):
        # type: (...) -> None
        headers = dict(self.headers)
        bytes = self.get_bytes()
        headers["length"] = len(bytes)
        f.write(json_dumps(headers))
        f.write(b"\n")
        f.write(bytes)
        f.write(b"\n")

    def serialize(self):
        # type: (...) -> bytes
        out = io.BytesIO()
        self.serialize_into(out)
        return out.getvalue()

    @classmethod
    def deserialize_from(
        cls, f  # type: Any
    ):
        # type: (...) -> Optional[Item]
        line = f.readline().rstrip()
        if not line:
            return None
        headers = parse_json(line)
        length = headers.get("length")
        if length is not None:
            payload = f.read(length)
            f.readline()
        else:
            # if no length was specified we need to read up to the end of line
            # and remove it (if it is present, i.e. not the very last char in an eof terminated envelope)
            payload = f.readline().rstrip(b"\n")
        if headers.get("type") in ("event", "transaction", "metric_buckets"):
            rv = cls(headers=headers, payload=PayloadRef(json=parse_json(payload)))
        else:
            rv = cls(headers=headers, payload=payload)
        return rv

    @classmethod
    def deserialize(
        cls, bytes  # type: bytes
    ):
        # type: (...) -> Optional[Item]
        return cls.deserialize_from(io.BytesIO(bytes))
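On the wire an envelope is newline-delimited JSON: one envelope-headers line, then for each item an item-headers line (carrying a `length` field) followed by the raw payload bytes. A minimal sketch of that framing, independent of the classes above (function names here are illustrative):

```python
import io
import json


def write_envelope(f, env_headers, items):
    # items: list of (item_headers, payload_bytes) pairs;
    # mirrors Envelope.serialize_into / Item.serialize_into.
    f.write(json.dumps(env_headers).encode("utf-8"))
    f.write(b"\n")
    for headers, payload in items:
        headers = dict(headers, length=len(payload))
        f.write(json.dumps(headers).encode("utf-8"))
        f.write(b"\n")
        f.write(payload)
        f.write(b"\n")


def read_envelope(f):
    # Mirrors Envelope.deserialize_from: read items until an empty
    # header line signals end-of-envelope.
    env_headers = json.loads(f.readline())
    items = []
    while True:
        line = f.readline().rstrip()
        if not line:
            break
        headers = json.loads(line)
        payload = f.read(headers["length"])
        f.readline()  # consume the newline that terminates the payload
        items.append((headers, payload))
    return env_headers, items


buf = io.BytesIO()
write_envelope(buf, {"event_id": "abc"}, [({"type": "event"}, b'{"message":"hi"}')])
buf.seek(0)
headers, items = read_envelope(buf)
```

Declaring `length` up front is what lets the reader take the payload verbatim, so binary attachments can contain newlines without breaking the framing.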
@@ -0,0 +1,68 @@
import copy
import sentry_sdk
from sentry_sdk._lru_cache import LRUCache
from threading import Lock

from typing import TYPE_CHECKING, Any

if TYPE_CHECKING:
    from typing import TypedDict

    FlagData = TypedDict("FlagData", {"flag": str, "result": bool})


DEFAULT_FLAG_CAPACITY = 100


class FlagBuffer:

    def __init__(self, capacity):
        # type: (int) -> None
        self.capacity = capacity
        self.lock = Lock()

        # Buffer is private. The name is mangled to discourage use. If you use this attribute
        # directly you're on your own!
        self.__buffer = LRUCache(capacity)

    def clear(self):
        # type: () -> None
        self.__buffer = LRUCache(self.capacity)

    def __deepcopy__(self, memo):
        # type: (dict[int, Any]) -> FlagBuffer
        with self.lock:
            buffer = FlagBuffer(self.capacity)
            buffer.__buffer = copy.deepcopy(self.__buffer, memo)
            return buffer

    def get(self):
        # type: () -> list[FlagData]
        with self.lock:
            return [
                {"flag": key, "result": value} for key, value in self.__buffer.get_all()
            ]

    def set(self, flag, result):
        # type: (str, bool) -> None
        if isinstance(result, FlagBuffer):
            # If someone were to insert `self` into `self` this would create a circular dependency
            # on the lock. This is of course a deadlock. However, this is far outside the expected
            # usage of this class. We guard against it here for completeness and to document this
            # expected failure mode.
            raise ValueError(
                "FlagBuffer instances can not be inserted into the dictionary."
            )

        with self.lock:
            self.__buffer.set(flag, result)


def add_feature_flag(flag, result):
    # type: (str, bool) -> None
    """
    Records a flag and its value to be sent on subsequent error events.
    We recommend you do this on flag evaluations. Flags are buffered per Sentry scope.
    """
    flags = sentry_sdk.get_current_scope().flags
    flags.set(flag, result)
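`FlagBuffer` keeps only the most recent `capacity` flag evaluations, evicting the least recently set one on overflow. A sketch of that LRU behavior using `collections.OrderedDict` (the SDK's own `LRUCache` is a separate internal class; `TinyFlagBuffer` is an assumption-level stand-in, not SDK API):

```python
from collections import OrderedDict


class TinyFlagBuffer:
    def __init__(self, capacity):
        self.capacity = capacity
        self._buffer = OrderedDict()

    def set(self, flag, result):
        # Re-setting an existing flag moves it to the most-recent position;
        # overflow evicts the least recently set entry from the front.
        if flag in self._buffer:
            self._buffer.move_to_end(flag)
        self._buffer[flag] = result
        if len(self._buffer) > self.capacity:
            self._buffer.popitem(last=False)

    def get(self):
        # Same shape as FlagBuffer.get: a list of {"flag", "result"} dicts.
        return [{"flag": k, "result": v} for k, v in self._buffer.items()]


buf = TinyFlagBuffer(capacity=2)
buf.set("feature-a", True)
buf.set("feature-b", False)
buf.set("feature-a", True)   # refreshes feature-a's recency
buf.set("feature-c", True)   # evicts feature-b, now the least recent
```

Bounding the buffer keeps error events small even when an application evaluates hundreds of distinct flags per request.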
@@ -0,0 +1,739 @@
|
||||
import warnings
|
||||
from contextlib import contextmanager
|
||||
|
||||
from sentry_sdk import (
|
||||
get_client,
|
||||
get_global_scope,
|
||||
get_isolation_scope,
|
||||
get_current_scope,
|
||||
)
|
||||
from sentry_sdk._compat import with_metaclass
|
||||
from sentry_sdk.consts import INSTRUMENTER
|
||||
from sentry_sdk.scope import _ScopeManager
|
||||
from sentry_sdk.client import Client
|
||||
from sentry_sdk.tracing import (
|
||||
NoOpSpan,
|
||||
Span,
|
||||
Transaction,
|
||||
)
|
||||
|
||||
from sentry_sdk.utils import (
|
||||
logger,
|
||||
ContextVar,
|
||||
)
|
||||
|
||||
from typing import TYPE_CHECKING
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from typing import Any
|
||||
from typing import Callable
|
||||
from typing import ContextManager
|
||||
from typing import Dict
|
||||
from typing import Generator
|
||||
from typing import List
|
||||
from typing import Optional
|
||||
from typing import overload
|
||||
from typing import Tuple
|
||||
from typing import Type
|
||||
from typing import TypeVar
|
||||
from typing import Union
|
||||
|
||||
from typing_extensions import Unpack
|
||||
|
||||
from sentry_sdk.scope import Scope
|
||||
from sentry_sdk.client import BaseClient
|
||||
from sentry_sdk.integrations import Integration
|
||||
from sentry_sdk._types import (
|
||||
Event,
|
||||
Hint,
|
||||
Breadcrumb,
|
||||
BreadcrumbHint,
|
||||
ExcInfo,
|
||||
LogLevelStr,
|
||||
SamplingContext,
|
||||
)
|
||||
from sentry_sdk.tracing import TransactionKwargs
|
||||
|
||||
T = TypeVar("T")
|
||||
|
||||
else:
|
||||
|
||||
def overload(x):
|
||||
# type: (T) -> T
|
||||
return x
|
||||
|
||||
|
||||
class SentryHubDeprecationWarning(DeprecationWarning):
|
||||
"""
|
||||
A custom deprecation warning to inform users that the Hub is deprecated.
|
||||
"""
|
||||
|
||||
_MESSAGE = (
|
||||
"`sentry_sdk.Hub` is deprecated and will be removed in a future major release. "
|
||||
"Please consult our 1.x to 2.x migration guide for details on how to migrate "
|
||||
"`Hub` usage to the new API: "
|
||||
"https://docs.sentry.io/platforms/python/migration/1.x-to-2.x"
|
||||
)
|
||||
|
||||
def __init__(self, *_):
|
||||
# type: (*object) -> None
|
||||
super().__init__(self._MESSAGE)
|
||||
|
||||
|
||||
@contextmanager
|
||||
def _suppress_hub_deprecation_warning():
|
||||
# type: () -> Generator[None, None, None]
|
||||
"""Utility function to suppress deprecation warnings for the Hub."""
|
||||
with warnings.catch_warnings():
|
||||
warnings.filterwarnings("ignore", category=SentryHubDeprecationWarning)
|
||||
yield
|
||||
|
||||
|
||||
_local = ContextVar("sentry_current_hub")
|
||||
|
||||
|
||||
class HubMeta(type):
|
||||
@property
|
||||
def current(cls):
|
||||
# type: () -> Hub
|
||||
"""Returns the current instance of the hub."""
|
||||
warnings.warn(SentryHubDeprecationWarning(), stacklevel=2)
|
||||
rv = _local.get(None)
|
||||
if rv is None:
|
||||
with _suppress_hub_deprecation_warning():
|
||||
# This will raise a deprecation warning; suppress it since we already warned above.
|
||||
rv = Hub(GLOBAL_HUB)
|
||||
_local.set(rv)
|
||||
return rv
|
||||
|
||||
@property
|
||||
def main(cls):
|
||||
# type: () -> Hub
|
||||
"""Returns the main instance of the hub."""
|
||||
warnings.warn(SentryHubDeprecationWarning(), stacklevel=2)
|
||||
return GLOBAL_HUB
|
||||
|
||||
|
||||
class Hub(with_metaclass(HubMeta)): # type: ignore
|
||||
"""
|
||||
.. deprecated:: 2.0.0
|
||||
The Hub is deprecated. Its functionality will be merged into :py:class:`sentry_sdk.scope.Scope`.
|
||||
|
||||
The hub wraps the concurrency management of the SDK. Each thread has
|
||||
its own hub but the hub might transfer with the flow of execution if
|
||||
context vars are available.
|
||||
|
||||
If the hub is used with a with statement it's temporarily activated.
|
||||
"""
|
||||
|
||||
_stack = None # type: List[Tuple[Optional[Client], Scope]]
|
||||
_scope = None # type: Optional[Scope]
|
||||
|
||||
# Mypy doesn't pick up on the metaclass.
|
||||
|
||||
if TYPE_CHECKING:
|
||||
current = None # type: Hub
|
||||
main = None # type: Hub
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
client_or_hub=None, # type: Optional[Union[Hub, Client]]
|
||||
scope=None, # type: Optional[Any]
|
||||
):
|
||||
# type: (...) -> None
|
||||
warnings.warn(SentryHubDeprecationWarning(), stacklevel=2)
|
||||
|
||||
current_scope = None
|
||||
|
||||
if isinstance(client_or_hub, Hub):
|
||||
client = get_client()
|
||||
if scope is None:
|
||||
# hub cloning is going on, we use a fork of the current/isolation scope for context manager
|
||||
scope = get_isolation_scope().fork()
|
||||
current_scope = get_current_scope().fork()
|
||||
else:
|
||||
client = client_or_hub # type: ignore
|
||||
get_global_scope().set_client(client)
|
||||
|
||||
if scope is None: # so there is no Hub cloning going on
|
||||
# just the current isolation scope is used for context manager
|
||||
scope = get_isolation_scope()
|
||||
current_scope = get_current_scope()
|
||||
|
||||
if current_scope is None:
|
||||
# just the current current scope is used for context manager
|
||||
current_scope = get_current_scope()
|
||||
|
||||
self._stack = [(client, scope)] # type: ignore
|
||||
self._last_event_id = None # type: Optional[str]
|
||||
self._old_hubs = [] # type: List[Hub]
|
||||
|
||||
self._old_current_scopes = [] # type: List[Scope]
|
||||
self._old_isolation_scopes = [] # type: List[Scope]
|
||||
self._current_scope = current_scope # type: Scope
|
||||
self._scope = scope # type: Scope
|
||||
|
||||
def __enter__(self):
|
||||
# type: () -> Hub
|
||||
self._old_hubs.append(Hub.current)
|
||||
_local.set(self)
|
||||
|
||||
current_scope = get_current_scope()
|
||||
self._old_current_scopes.append(current_scope)
|
||||
scope._current_scope.set(self._current_scope)
|
||||
|
||||
isolation_scope = get_isolation_scope()
|
||||
self._old_isolation_scopes.append(isolation_scope)
|
||||
scope._isolation_scope.set(self._scope)
|
||||
|
||||
return self
|
||||
|
||||
def __exit__(
|
||||
self,
|
||||
exc_type, # type: Optional[type]
|
||||
exc_value, # type: Optional[BaseException]
|
||||
tb, # type: Optional[Any]
|
||||
):
|
||||
# type: (...) -> None
|
||||
old = self._old_hubs.pop()
|
||||
_local.set(old)
|
||||
|
||||
old_current_scope = self._old_current_scopes.pop()
|
||||
scope._current_scope.set(old_current_scope)
|
||||
|
||||
old_isolation_scope = self._old_isolation_scopes.pop()
|
||||
scope._isolation_scope.set(old_isolation_scope)
|
||||
|
||||
def run(
|
||||
self, callback # type: Callable[[], T]
|
||||
):
|
||||
# type: (...) -> T
|
||||
"""
|
||||
.. deprecated:: 2.0.0
|
||||
This function is deprecated and will be removed in a future release.
|
||||
|
||||
Runs a callback in the context of the hub. Alternatively the
|
||||
with statement can be used on the hub directly.
|
||||
"""
|
||||
with self:
|
||||
return callback()
|
||||
|
||||
def get_integration(
|
||||
self, name_or_class # type: Union[str, Type[Integration]]
|
||||
):
|
||||
# type: (...) -> Any
|
||||
"""
|
||||
.. deprecated:: 2.0.0
|
||||
This function is deprecated and will be removed in a future release.
|
||||
Please use :py:meth:`sentry_sdk.client._Client.get_integration` instead.
|
||||
|
||||
Returns the integration for this hub by name or class. If there
|
||||
is no client bound or the client does not have that integration
|
||||
then `None` is returned.
|
||||
|
||||
If the return value is not `None` the hub is guaranteed to have a
|
||||
client attached.
|
||||
"""
|
||||
return get_client().get_integration(name_or_class)
|
||||
|
||||
@property
|
||||
def client(self):
|
||||
# type: () -> Optional[BaseClient]
|
||||
"""
|
||||
.. deprecated:: 2.0.0
|
||||
This property is deprecated and will be removed in a future release.
|
||||
Please use :py:func:`sentry_sdk.api.get_client` instead.
|
||||
|
||||
Returns the current client on the hub.
|
||||
"""
|
||||
client = get_client()
|
||||
|
||||
if not client.is_active():
|
||||
return None
|
||||
|
||||
return client
|
||||
|
||||
@property
|
||||
def scope(self):
|
||||
# type: () -> Scope
|
||||
"""
|
||||
.. deprecated:: 2.0.0
|
||||
This property is deprecated and will be removed in a future release.
|
||||
Returns the current scope on the hub.
|
||||
"""
|
||||
return get_isolation_scope()
|
||||
|
||||
def last_event_id(self):
|
||||
# type: () -> Optional[str]
|
||||
"""
|
||||
Returns the last event ID.
|
||||
|
||||
.. deprecated:: 1.40.5
|
||||
This function is deprecated and will be removed in a future release. The functions `capture_event`, `capture_message`, and `capture_exception` return the event ID directly.
|
||||
"""
|
||||
logger.warning(
|
||||
"Deprecated: last_event_id is deprecated. This will be removed in the future. The functions `capture_event`, `capture_message`, and `capture_exception` return the event ID directly."
|
||||
)
|
||||
return self._last_event_id
|
||||
|
||||
def bind_client(
|
||||
self, new # type: Optional[BaseClient]
|
||||
):
|
||||
# type: (...) -> None
|
||||
"""
|
||||
.. deprecated:: 2.0.0
|
||||
This function is deprecated and will be removed in a future release.
|
||||
Please use :py:meth:`sentry_sdk.Scope.set_client` instead.
|
||||
|
||||
Binds a new client to the hub.
|
||||
"""
|
||||
get_global_scope().set_client(new)
|
||||
|
||||
def capture_event(self, event, hint=None, scope=None, **scope_kwargs):
|
||||
# type: (Event, Optional[Hint], Optional[Scope], Any) -> Optional[str]
|
||||
"""
|
||||
.. deprecated:: 2.0.0
|
||||
This function is deprecated and will be removed in a future release.
|
||||
Please use :py:meth:`sentry_sdk.Scope.capture_event` instead.
|
||||
|
||||
Captures an event.
|
||||
|
||||
Alias of :py:meth:`sentry_sdk.Scope.capture_event`.
|
||||
|
||||
:param event: A ready-made event that can be directly sent to Sentry.
|
||||
|
||||
:param hint: Contains metadata about the event that can be read from `before_send`, such as the original exception object or a HTTP request object.
|
||||
|
||||
:param scope: An optional :py:class:`sentry_sdk.Scope` to apply to events.
|
||||
The `scope` and `scope_kwargs` parameters are mutually exclusive.
|
||||
|
||||
:param scope_kwargs: Optional data to apply to event.
|
||||
For supported `**scope_kwargs` see :py:meth:`sentry_sdk.Scope.update_from_kwargs`.
|
||||
The `scope` and `scope_kwargs` parameters are mutually exclusive.
|
||||
"""
|
||||
last_event_id = get_current_scope().capture_event(
|
||||
event, hint, scope=scope, **scope_kwargs
|
||||
)
|
||||
|
||||
is_transaction = event.get("type") == "transaction"
|
||||
if last_event_id is not None and not is_transaction:
|
||||
self._last_event_id = last_event_id
|
||||
|
||||
return last_event_id
|
||||
|
||||
    def capture_message(self, message, level=None, scope=None, **scope_kwargs):
        # type: (str, Optional[LogLevelStr], Optional[Scope], Any) -> Optional[str]
        """
        .. deprecated:: 2.0.0
            This function is deprecated and will be removed in a future release.
            Please use :py:meth:`sentry_sdk.Scope.capture_message` instead.

        Captures a message.

        Alias of :py:meth:`sentry_sdk.Scope.capture_message`.

        :param message: The string to send as the message to Sentry.

        :param level: If no level is provided, the default level is `info`.

        :param scope: An optional :py:class:`sentry_sdk.Scope` to apply to events.
            The `scope` and `scope_kwargs` parameters are mutually exclusive.

        :param scope_kwargs: Optional data to apply to event.
            For supported `**scope_kwargs` see :py:meth:`sentry_sdk.Scope.update_from_kwargs`.
            The `scope` and `scope_kwargs` parameters are mutually exclusive.

        :returns: An `event_id` if the SDK decided to send the event (see :py:meth:`sentry_sdk.client._Client.capture_event`).
        """
        last_event_id = get_current_scope().capture_message(
            message, level=level, scope=scope, **scope_kwargs
        )

        if last_event_id is not None:
            self._last_event_id = last_event_id

        return last_event_id
    def capture_exception(self, error=None, scope=None, **scope_kwargs):
        # type: (Optional[Union[BaseException, ExcInfo]], Optional[Scope], Any) -> Optional[str]
        """
        .. deprecated:: 2.0.0
            This function is deprecated and will be removed in a future release.
            Please use :py:meth:`sentry_sdk.Scope.capture_exception` instead.

        Captures an exception.

        Alias of :py:meth:`sentry_sdk.Scope.capture_exception`.

        :param error: An exception to capture. If `None`, `sys.exc_info()` will be used.

        :param scope: An optional :py:class:`sentry_sdk.Scope` to apply to events.
            The `scope` and `scope_kwargs` parameters are mutually exclusive.

        :param scope_kwargs: Optional data to apply to event.
            For supported `**scope_kwargs` see :py:meth:`sentry_sdk.Scope.update_from_kwargs`.
            The `scope` and `scope_kwargs` parameters are mutually exclusive.

        :returns: An `event_id` if the SDK decided to send the event (see :py:meth:`sentry_sdk.client._Client.capture_event`).
        """
        last_event_id = get_current_scope().capture_exception(
            error, scope=scope, **scope_kwargs
        )

        if last_event_id is not None:
            self._last_event_id = last_event_id

        return last_event_id
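The `error` parameter above accepts either an exception instance, an exc_info triple, or `None` (meaning "use `sys.exc_info()`"). A rough, stand-alone sketch of that normalization; the helper name `to_exc_info` is ours, not part of the SDK:

```python
import sys

def to_exc_info(error=None):
    # Normalize the `error` argument the way capture_exception describes:
    # an exception instance, a (type, value, traceback) triple, or None
    # meaning "whatever exception is currently being handled".
    if error is None:
        return sys.exc_info()
    if isinstance(error, BaseException):
        return (type(error), error, error.__traceback__)
    return error  # assume it is already an exc_info triple

try:
    1 / 0
except ZeroDivisionError:
    exc_type, exc_value, _ = to_exc_info()

assert exc_type is ZeroDivisionError
assert isinstance(exc_value, ZeroDivisionError)
```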
    def add_breadcrumb(self, crumb=None, hint=None, **kwargs):
        # type: (Optional[Breadcrumb], Optional[BreadcrumbHint], Any) -> None
        """
        .. deprecated:: 2.0.0
            This function is deprecated and will be removed in a future release.
            Please use :py:meth:`sentry_sdk.Scope.add_breadcrumb` instead.

        Adds a breadcrumb.

        :param crumb: Dictionary with the data as the sentry v7/v8 protocol expects.

        :param hint: An optional value that can be used by `before_breadcrumb`
            to customize the breadcrumbs that are emitted.
        """
        get_isolation_scope().add_breadcrumb(crumb, hint, **kwargs)
    def start_span(self, instrumenter=INSTRUMENTER.SENTRY, **kwargs):
        # type: (str, Any) -> Span
        """
        .. deprecated:: 2.0.0
            This function is deprecated and will be removed in a future release.
            Please use :py:meth:`sentry_sdk.Scope.start_span` instead.

        Start a span whose parent is the currently active span or transaction, if any.

        The return value is a :py:class:`sentry_sdk.tracing.Span` instance,
        typically used as a context manager to start and stop timing in a `with`
        block.

        Only spans contained in a transaction are sent to Sentry. Most
        integrations start a transaction at the appropriate time, for example
        for every incoming HTTP request. Use
        :py:meth:`sentry_sdk.start_transaction` to start a new transaction when
        one is not already in progress.

        For supported `**kwargs` see :py:class:`sentry_sdk.tracing.Span`.
        """
        scope = get_current_scope()
        return scope.start_span(instrumenter=instrumenter, **kwargs)
    def start_transaction(
        self,
        transaction=None,
        instrumenter=INSTRUMENTER.SENTRY,
        custom_sampling_context=None,
        **kwargs
    ):
        # type: (Optional[Transaction], str, Optional[SamplingContext], Unpack[TransactionKwargs]) -> Union[Transaction, NoOpSpan]
        """
        .. deprecated:: 2.0.0
            This function is deprecated and will be removed in a future release.
            Please use :py:meth:`sentry_sdk.Scope.start_transaction` instead.

        Start and return a transaction.

        Start an existing transaction if given, otherwise create and start a new
        transaction with kwargs.

        This is the entry point to manual tracing instrumentation.

        A tree structure can be built by adding child spans to the transaction,
        and child spans to other spans. To start a new child span within the
        transaction or any span, call the respective `.start_child()` method.

        Every child span must be finished before the transaction is finished,
        otherwise the unfinished spans are discarded.

        When used as context managers, spans and transactions are automatically
        finished at the end of the `with` block. If not using context managers,
        call the `.finish()` method.

        When the transaction is finished, it will be sent to Sentry with all its
        finished child spans.

        For supported `**kwargs` see :py:class:`sentry_sdk.tracing.Transaction`.
        """
        scope = get_current_scope()

        # For backwards compatibility, we allow passing the scope as the hub.
        # We need a major release to make this nice. (if someone searches the code: deprecated)
        # Type checking disabled for this line because deprecated keys are not allowed in the type signature.
        kwargs["hub"] = scope  # type: ignore

        return scope.start_transaction(
            transaction, instrumenter, custom_sampling_context, **kwargs
        )
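The tree-of-spans model the docstring describes (a transaction owns child spans, children finish before the transaction, `with` blocks handle start/finish) can be sketched with a toy stand-in; `ToySpan` is illustrative and not the Sentry API:

```python
import time
from contextlib import contextmanager

class ToySpan:
    """Toy model of the span tree described above; not sentry_sdk code."""

    def __init__(self, op):
        self.op = op
        self.children = []
        self.duration = None

    @contextmanager
    def start_child(self, op):
        # Mirrors the .start_child() pattern: the child is timed for the
        # duration of the `with` block and attached to its parent on exit.
        child = ToySpan(op)
        start = time.monotonic()
        try:
            yield child
        finally:
            child.duration = time.monotonic() - start
            self.children.append(child)

tx = ToySpan("http.server")
with tx.start_child("db.query") as db:
    with db.start_child("db.connect"):
        pass

assert [c.op for c in tx.children] == ["db.query"]
assert tx.children[0].children[0].op == "db.connect"
assert tx.children[0].duration is not None
```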
    def continue_trace(self, environ_or_headers, op=None, name=None, source=None):
        # type: (Dict[str, Any], Optional[str], Optional[str], Optional[str]) -> Transaction
        """
        .. deprecated:: 2.0.0
            This function is deprecated and will be removed in a future release.
            Please use :py:meth:`sentry_sdk.Scope.continue_trace` instead.

        Sets the propagation context from environment or headers and returns a transaction.
        """
        return get_isolation_scope().continue_trace(
            environ_or_headers=environ_or_headers, op=op, name=name, source=source
        )
    @overload
    def push_scope(
        self, callback=None  # type: Optional[None]
    ):
        # type: (...) -> ContextManager[Scope]
        pass

    @overload
    def push_scope(  # noqa: F811
        self, callback  # type: Callable[[Scope], None]
    ):
        # type: (...) -> None
        pass

    def push_scope(  # noqa
        self,
        callback=None,  # type: Optional[Callable[[Scope], None]]
        continue_trace=True,  # type: bool
    ):
        # type: (...) -> Optional[ContextManager[Scope]]
        """
        .. deprecated:: 2.0.0
            This function is deprecated and will be removed in a future release.

        Pushes a new layer on the scope stack.

        :param callback: If provided, this method pushes a scope, calls
            `callback`, and pops the scope again.

        :returns: If no `callback` is provided, a context manager that should
            be used to pop the scope again.
        """
        if callback is not None:
            with self.push_scope() as scope:
                callback(scope)
            return None

        return _ScopeManager(self)
    def pop_scope_unsafe(self):
        # type: () -> Tuple[Optional[Client], Scope]
        """
        .. deprecated:: 2.0.0
            This function is deprecated and will be removed in a future release.

        Pops a scope layer from the stack.

        Try to use the context manager :py:meth:`push_scope` instead.
        """
        rv = self._stack.pop()
        assert self._stack, "stack must have at least one layer"
        return rv
    @overload
    def configure_scope(
        self, callback=None  # type: Optional[None]
    ):
        # type: (...) -> ContextManager[Scope]
        pass

    @overload
    def configure_scope(  # noqa: F811
        self, callback  # type: Callable[[Scope], None]
    ):
        # type: (...) -> None
        pass

    def configure_scope(  # noqa
        self,
        callback=None,  # type: Optional[Callable[[Scope], None]]
        continue_trace=True,  # type: bool
    ):
        # type: (...) -> Optional[ContextManager[Scope]]
        """
        .. deprecated:: 2.0.0
            This function is deprecated and will be removed in a future release.

        Reconfigures the scope.

        :param callback: If provided, call the callback with the current scope.

        :returns: If no callback is provided, returns a context manager that returns the scope.
        """
        scope = get_isolation_scope()

        if continue_trace:
            scope.generate_propagation_context()

        if callback is not None:
            # TODO: used to return None when client is None. Check if this changes behavior.
            callback(scope)

            return None

        @contextmanager
        def inner():
            # type: () -> Generator[Scope, None, None]
            yield scope

        return inner()
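Both `push_scope` and `configure_scope` support a dual calling convention: with a callback the work happens immediately and `None` is returned; without one, the caller gets a context manager. A stand-alone sketch of that pattern; `ToyScope` and `with_scope` are illustrative names, not the SDK's:

```python
from contextlib import contextmanager

class ToyScope:
    """Illustrative stand-in for a scope object."""

    def __init__(self):
        self.tags = {}

def with_scope(callback=None):
    # Callback style: run the callback against the scope, return None.
    scope = ToyScope()
    if callback is not None:
        callback(scope)
        return None

    # Context-manager style: hand the scope to the `with` block instead.
    @contextmanager
    def inner():
        yield scope

    return inner()

# Callback style returns None after invoking the callback:
seen = []
assert with_scope(lambda s: seen.append(s)) is None
assert isinstance(seen[0], ToyScope)

# Context-manager style yields the scope to the caller:
with with_scope() as scope:
    scope.tags["key"] = "value"
assert scope.tags == {"key": "value"}
```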
    def start_session(
        self, session_mode="application"  # type: str
    ):
        # type: (...) -> None
        """
        .. deprecated:: 2.0.0
            This function is deprecated and will be removed in a future release.
            Please use :py:meth:`sentry_sdk.Scope.start_session` instead.

        Starts a new session.
        """
        get_isolation_scope().start_session(
            session_mode=session_mode,
        )

    def end_session(self):
        # type: (...) -> None
        """
        .. deprecated:: 2.0.0
            This function is deprecated and will be removed in a future release.
            Please use :py:meth:`sentry_sdk.Scope.end_session` instead.

        Ends the current session if there is one.
        """
        get_isolation_scope().end_session()

    def stop_auto_session_tracking(self):
        # type: (...) -> None
        """
        .. deprecated:: 2.0.0
            This function is deprecated and will be removed in a future release.
            Please use :py:meth:`sentry_sdk.Scope.stop_auto_session_tracking` instead.

        Stops automatic session tracking.

        This temporarily disables session tracking for the current scope when called.
        To resume session tracking call `resume_auto_session_tracking`.
        """
        get_isolation_scope().stop_auto_session_tracking()
    def resume_auto_session_tracking(self):
        # type: (...) -> None
        """
        .. deprecated:: 2.0.0
            This function is deprecated and will be removed in a future release.
            Please use :py:meth:`sentry_sdk.Scope.resume_auto_session_tracking` instead.

        Resumes automatic session tracking for the current scope if
        disabled earlier. This requires that generally automatic session
        tracking is enabled.
        """
        get_isolation_scope().resume_auto_session_tracking()

    def flush(
        self,
        timeout=None,  # type: Optional[float]
        callback=None,  # type: Optional[Callable[[int, float], None]]
    ):
        # type: (...) -> None
        """
        .. deprecated:: 2.0.0
            This function is deprecated and will be removed in a future release.
            Please use :py:meth:`sentry_sdk.client._Client.flush` instead.

        Alias for :py:meth:`sentry_sdk.client._Client.flush`
        """
        return get_client().flush(timeout=timeout, callback=callback)
    def get_traceparent(self):
        # type: () -> Optional[str]
        """
        .. deprecated:: 2.0.0
            This function is deprecated and will be removed in a future release.
            Please use :py:meth:`sentry_sdk.Scope.get_traceparent` instead.

        Returns the traceparent either from the active span or from the scope.
        """
        current_scope = get_current_scope()
        traceparent = current_scope.get_traceparent()

        if traceparent is None:
            isolation_scope = get_isolation_scope()
            traceparent = isolation_scope.get_traceparent()

        return traceparent

    def get_baggage(self):
        # type: () -> Optional[str]
        """
        .. deprecated:: 2.0.0
            This function is deprecated and will be removed in a future release.
            Please use :py:meth:`sentry_sdk.Scope.get_baggage` instead.

        Returns Baggage either from the active span or from the scope.
        """
        current_scope = get_current_scope()
        baggage = current_scope.get_baggage()

        if baggage is None:
            isolation_scope = get_isolation_scope()
            baggage = isolation_scope.get_baggage()

        if baggage is not None:
            return baggage.serialize()

        return None
    def iter_trace_propagation_headers(self, span=None):
        # type: (Optional[Span]) -> Generator[Tuple[str, str], None, None]
        """
        .. deprecated:: 2.0.0
            This function is deprecated and will be removed in a future release.
            Please use :py:meth:`sentry_sdk.Scope.iter_trace_propagation_headers` instead.

        Return HTTP headers which allow propagation of trace data. Data taken
        from the span representing the request, if available, or the current
        span on the scope if not.
        """
        return get_current_scope().iter_trace_propagation_headers(
            span=span,
        )

    def trace_propagation_meta(self, span=None):
        # type: (Optional[Span]) -> str
        """
        .. deprecated:: 2.0.0
            This function is deprecated and will be removed in a future release.
            Please use :py:meth:`sentry_sdk.Scope.trace_propagation_meta` instead.

        Return meta tags which should be injected into HTML templates
        to allow propagation of trace information.
        """
        if span is not None:
            logger.warning(
                "The parameter `span` in trace_propagation_meta() is deprecated and will be removed in the future."
            )

        return get_current_scope().trace_propagation_meta(
            span=span,
        )
with _suppress_hub_deprecation_warning():
    # Suppress deprecation warning for the Hub here, since we still always
    # import this module.
    GLOBAL_HUB = Hub()
_local.set(GLOBAL_HUB)


# Circular imports
from sentry_sdk import scope
from abc import ABC, abstractmethod
from threading import Lock

from sentry_sdk.utils import logger

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from collections.abc import Sequence
    from typing import Callable
    from typing import Dict
    from typing import Iterator
    from typing import List
    from typing import Optional
    from typing import Set
    from typing import Type
    from typing import Union


_DEFAULT_FAILED_REQUEST_STATUS_CODES = frozenset(range(500, 600))


_installer_lock = Lock()

# Set of all integration identifiers we have attempted to install
_processed_integrations = set()  # type: Set[str]

# Set of all integration identifiers we have actually installed
_installed_integrations = set()  # type: Set[str]
def _generate_default_integrations_iterator(
    integrations,  # type: List[str]
    auto_enabling_integrations,  # type: List[str]
):
    # type: (...) -> Callable[[bool], Iterator[Type[Integration]]]

    def iter_default_integrations(with_auto_enabling_integrations):
        # type: (bool) -> Iterator[Type[Integration]]
        """Returns an iterator of the default integration classes:"""
        from importlib import import_module

        if with_auto_enabling_integrations:
            all_import_strings = integrations + auto_enabling_integrations
        else:
            all_import_strings = integrations

        for import_string in all_import_strings:
            try:
                module, cls = import_string.rsplit(".", 1)
                yield getattr(import_module(module), cls)
            except (DidNotEnable, SyntaxError) as e:
                logger.debug(
                    "Did not import default integration %s: %s", import_string, e
                )

    if isinstance(iter_default_integrations.__doc__, str):
        for import_string in integrations:
            iter_default_integrations.__doc__ += "\n- `{}`".format(import_string)

    return iter_default_integrations
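The iterator above loads each integration class from a dotted `"package.module.ClassName"` string by splitting off the final component and importing the rest. The same two-step lookup, demonstrated with a real stdlib class:

```python
from importlib import import_module

# Split "module path" from "attribute name" at the last dot, then import
# the module and fetch the attribute -- exactly the lookup used above.
import_string = "collections.OrderedDict"
module, cls = import_string.rsplit(".", 1)
loaded = getattr(import_module(module), cls)

from collections import OrderedDict
assert loaded is OrderedDict
```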
_DEFAULT_INTEGRATIONS = [
    # stdlib/base runtime integrations
    "sentry_sdk.integrations.argv.ArgvIntegration",
    "sentry_sdk.integrations.atexit.AtexitIntegration",
    "sentry_sdk.integrations.dedupe.DedupeIntegration",
    "sentry_sdk.integrations.excepthook.ExcepthookIntegration",
    "sentry_sdk.integrations.logging.LoggingIntegration",
    "sentry_sdk.integrations.modules.ModulesIntegration",
    "sentry_sdk.integrations.stdlib.StdlibIntegration",
    "sentry_sdk.integrations.threading.ThreadingIntegration",
]

_AUTO_ENABLING_INTEGRATIONS = [
    "sentry_sdk.integrations.aiohttp.AioHttpIntegration",
    "sentry_sdk.integrations.anthropic.AnthropicIntegration",
    "sentry_sdk.integrations.ariadne.AriadneIntegration",
    "sentry_sdk.integrations.arq.ArqIntegration",
    "sentry_sdk.integrations.asyncpg.AsyncPGIntegration",
    "sentry_sdk.integrations.boto3.Boto3Integration",
    "sentry_sdk.integrations.bottle.BottleIntegration",
    "sentry_sdk.integrations.celery.CeleryIntegration",
    "sentry_sdk.integrations.chalice.ChaliceIntegration",
    "sentry_sdk.integrations.clickhouse_driver.ClickhouseDriverIntegration",
    "sentry_sdk.integrations.cohere.CohereIntegration",
    "sentry_sdk.integrations.django.DjangoIntegration",
    "sentry_sdk.integrations.falcon.FalconIntegration",
    "sentry_sdk.integrations.fastapi.FastApiIntegration",
    "sentry_sdk.integrations.flask.FlaskIntegration",
    "sentry_sdk.integrations.gql.GQLIntegration",
    "sentry_sdk.integrations.graphene.GrapheneIntegration",
    "sentry_sdk.integrations.httpx.HttpxIntegration",
    "sentry_sdk.integrations.huey.HueyIntegration",
    "sentry_sdk.integrations.huggingface_hub.HuggingfaceHubIntegration",
    "sentry_sdk.integrations.langchain.LangchainIntegration",
    "sentry_sdk.integrations.litestar.LitestarIntegration",
    "sentry_sdk.integrations.loguru.LoguruIntegration",
    "sentry_sdk.integrations.openai.OpenAIIntegration",
    "sentry_sdk.integrations.pymongo.PyMongoIntegration",
    "sentry_sdk.integrations.pyramid.PyramidIntegration",
    "sentry_sdk.integrations.quart.QuartIntegration",
    "sentry_sdk.integrations.redis.RedisIntegration",
    "sentry_sdk.integrations.rq.RqIntegration",
    "sentry_sdk.integrations.sanic.SanicIntegration",
    "sentry_sdk.integrations.sqlalchemy.SqlalchemyIntegration",
    "sentry_sdk.integrations.starlette.StarletteIntegration",
    "sentry_sdk.integrations.starlite.StarliteIntegration",
    "sentry_sdk.integrations.strawberry.StrawberryIntegration",
    "sentry_sdk.integrations.tornado.TornadoIntegration",
]

iter_default_integrations = _generate_default_integrations_iterator(
    integrations=_DEFAULT_INTEGRATIONS,
    auto_enabling_integrations=_AUTO_ENABLING_INTEGRATIONS,
)

del _generate_default_integrations_iterator
_MIN_VERSIONS = {
    "aiohttp": (3, 4),
    "anthropic": (0, 16),
    "ariadne": (0, 20),
    "arq": (0, 23),
    "asyncpg": (0, 23),
    "beam": (2, 12),
    "boto3": (1, 12),  # botocore
    "bottle": (0, 12),
    "celery": (4, 4, 7),
    "chalice": (1, 16, 0),
    "clickhouse_driver": (0, 2, 0),
    "django": (1, 8),
    "dramatiq": (1, 9),
    "falcon": (1, 4),
    "fastapi": (0, 79, 0),
    "flask": (1, 1, 4),
    "gql": (3, 4, 1),
    "graphene": (3, 3),
    "grpc": (1, 32, 0),  # grpcio
    "huggingface_hub": (0, 22),
    "langchain": (0, 0, 210),
    "launchdarkly": (9, 8, 0),
    "loguru": (0, 7, 0),
    "openai": (1, 0, 0),
    "openfeature": (0, 7, 1),
    "quart": (0, 16, 0),
    "ray": (2, 7, 0),
    "requests": (2, 0, 0),
    "rq": (0, 6),
    "sanic": (0, 8),
    "sqlalchemy": (1, 2),
    "starlette": (0, 16),
    "starlite": (1, 48),
    "statsig": (0, 55, 3),
    "strawberry": (0, 209, 5),
    "tornado": (6, 0),
    "typer": (0, 15),
    "unleash": (6, 0, 1),
}
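The `version < min_version` check in `_check_minimum_version` below relies on Python's lexicographic tuple comparison, which is also why `_MIN_VERSIONS` can freely mix 2- and 3-element tuples:

```python
# Tuples compare element-wise left to right, numerically (not as strings),
# and a shorter tuple that matches is treated as a prefix (i.e. smaller).
assert (1, 2) < (1, 10)            # 10 > 2 numerically, unlike "10" < "2"
assert (4, 4, 6) < (4, 4, 7)       # e.g. celery 4.4.6 is below the 4.4.7 minimum
assert (1, 8) < (1, 8, 0)          # shorter tuple compares as a prefix
assert not ((0, 79, 0) < (0, 9))   # 0.79.0 is NOT below 0.9: 79 > 9
```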
def setup_integrations(
    integrations,
    with_defaults=True,
    with_auto_enabling_integrations=False,
    disabled_integrations=None,
):
    # type: (Sequence[Integration], bool, bool, Optional[Sequence[Union[type[Integration], Integration]]]) -> Dict[str, Integration]
    """
    Given a list of integration instances, this installs them all.

    When `with_defaults` is set to `True` all default integrations are added
    unless they were already provided before.

    `disabled_integrations` takes precedence over `with_defaults` and
    `with_auto_enabling_integrations`.
    """
    integrations = dict(
        (integration.identifier, integration) for integration in integrations or ()
    )

    logger.debug("Setting up integrations (with default = %s)", with_defaults)

    # Integrations that will not be enabled
    disabled_integrations = [
        integration if isinstance(integration, type) else type(integration)
        for integration in disabled_integrations or []
    ]

    # Integrations that are not explicitly set up by the user.
    used_as_default_integration = set()

    if with_defaults:
        for integration_cls in iter_default_integrations(
            with_auto_enabling_integrations
        ):
            if integration_cls.identifier not in integrations:
                instance = integration_cls()
                integrations[instance.identifier] = instance
                used_as_default_integration.add(instance.identifier)

    for identifier, integration in integrations.items():
        with _installer_lock:
            if identifier not in _processed_integrations:
                if type(integration) in disabled_integrations:
                    logger.debug("Ignoring integration %s", identifier)
                else:
                    logger.debug(
                        "Setting up previously not enabled integration %s", identifier
                    )
                    try:
                        type(integration).setup_once()
                    except DidNotEnable as e:
                        if identifier not in used_as_default_integration:
                            raise

                        logger.debug(
                            "Did not enable default integration %s: %s", identifier, e
                        )
                    else:
                        _installed_integrations.add(identifier)

                _processed_integrations.add(identifier)

    integrations = {
        identifier: integration
        for identifier, integration in integrations.items()
        if identifier in _installed_integrations
    }

    for identifier in integrations:
        logger.debug("Enabling integration %s", identifier)

    return integrations
def _check_minimum_version(integration, version, package=None):
    # type: (type[Integration], Optional[tuple[int, ...]], Optional[str]) -> None
    package = package or integration.identifier

    if version is None:
        raise DidNotEnable(f"Unparsable {package} version.")

    min_version = _MIN_VERSIONS.get(integration.identifier)
    if min_version is None:
        return

    if version < min_version:
        raise DidNotEnable(
            f"Integration only supports {package} {'.'.join(map(str, min_version))} or newer."
        )
class DidNotEnable(Exception):  # noqa: N818
    """
    The integration could not be enabled due to a trivial user error like
    `flask` not being installed for the `FlaskIntegration`.

    This exception is silently swallowed for default integrations, but reraised
    for explicitly enabled integrations.
    """


class Integration(ABC):
    """Baseclass for all integrations.

    To accept options for an integration, implement your own constructor that
    saves those options on `self`.
    """

    install = None
    """Legacy method, do not implement."""

    identifier = None  # type: str
    """String unique ID of integration type"""

    @staticmethod
    @abstractmethod
    def setup_once():
        # type: () -> None
        """
        Initialize the integration.

        This function is only called once, ever. Configuration is not available
        at this point, so the only thing to do here is to hook into exception
        handlers, and perhaps do monkeypatches.

        Inside those hooks `Integration.current` can be used to access the
        instance again.
        """
        pass
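The contract the `Integration` base class describes (options stored on `self` in a custom constructor, a one-time static `setup_once` hook, a unique `identifier`) can be sketched stand-alone; this snippet redefines a look-alike base rather than importing sentry_sdk, and `MyIntegration` is a hypothetical example:

```python
from abc import ABC, abstractmethod

class Integration(ABC):
    """Stand-alone look-alike of the base class above; not sentry_sdk code."""

    identifier = None  # type: str

    @staticmethod
    @abstractmethod
    def setup_once():
        # type: () -> None
        ...

class MyIntegration(Integration):
    identifier = "my_integration"

    def __init__(self, flag=True):
        # Per the docstring: accept options via your own constructor
        # and save them on `self`.
        self.flag = flag

    @staticmethod
    def setup_once():
        # Called exactly once: hook exception handlers / monkeypatch here.
        pass

inst = MyIntegration(flag=False)
assert inst.identifier == "my_integration"
assert inst.flag is False
```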