GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007

Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.

Preamble

The GNU General Public License is a free, copyleft license for
software and other kinds of works.

The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.

When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.

For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.

Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.

For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.

Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.

Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.

The precise terms and conditions for copying, distribution and
modification follow.

TERMS AND CONDITIONS

0. Definitions.

"This License" refers to version 3 of the GNU General Public License.

"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.

"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.

To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.

A "covered work" means either the unmodified Program or a work based
on the Program.

To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.

An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.

1. Source Code.

The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.

A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.

The Corresponding Source for a work in source code form is that
same work.

2. Basic Permissions.

All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.

Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.

3. Protecting Users' Legal Rights From Anti-Circumvention Law.

No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.

4. Conveying Verbatim Copies.

You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

5. Conveying Modified Source Versions.

You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.

b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".

c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.

d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.

A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.

6. Conveying Non-Source Forms.

You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.

b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.

c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.

d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.

e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.

A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.

7. Additional Terms.

"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or

b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or

c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or

d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or

e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or

f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.

All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.

8. Termination.

You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.

Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

9. Acceptance Not Required for Having Copies.

You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.

10. Automatic Licensing of Downstream Recipients.

Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.

An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.

You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.

11. Patents.

A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".

A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.

Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.

If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.

A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.

Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.

12. No Surrender of Others' Freedom.

If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.

13. Use with the GNU Affero General Public License.
|
||||
|
||||
Notwithstanding any other provision of this License, you have
|
||||
permission to link or combine any covered work with a work licensed
|
||||
under version 3 of the GNU Affero General Public License into a single
|
||||
combined work, and to convey the resulting work. The terms of this
|
||||
License will continue to apply to the part which is the covered work,
|
||||
but the special requirements of the GNU Affero General Public License,
|
||||
section 13, concerning interaction through a network will apply to the
|
||||
combination as such.
|
||||
|
||||
14. Revised Versions of this License.
|
||||
|
||||
The Free Software Foundation may publish revised and/or new versions of
|
||||
the GNU General Public License from time to time. Such new versions will
|
||||
be similar in spirit to the present version, but may differ in detail to
|
||||
address new problems or concerns.
|
||||
|
||||
Each version is given a distinguishing version number. If the
|
||||
Program specifies that a certain numbered version of the GNU General
|
||||
Public License "or any later version" applies to it, you have the
|
||||
option of following the terms and conditions either of that numbered
|
||||
version or of any later version published by the Free Software
|
||||
Foundation. If the Program does not specify a version number of the
|
||||
GNU General Public License, you may choose any version ever published
|
||||
by the Free Software Foundation.
|
||||
|
||||
If the Program specifies that a proxy can decide which future
|
||||
versions of the GNU General Public License can be used, that proxy's
|
||||
public statement of acceptance of a version permanently authorizes you
|
||||
to choose that version for the Program.
|
||||
|
||||
Later license versions may give you additional or different
|
||||
permissions. However, no additional obligations are imposed on any
|
||||
author or copyright holder as a result of your choosing to follow a
|
||||
later version.
|
||||
|
||||
15. Disclaimer of Warranty.
|
||||
|
||||
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
|
||||
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
|
||||
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
|
||||
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
|
||||
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
|
||||
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
|
||||
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
|
||||
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
|
||||
|
||||
16. Limitation of Liability.
|
||||
|
||||
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
|
||||
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
|
||||
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
|
||||
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
|
||||
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
|
||||
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
|
||||
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
|
||||
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
|
||||
SUCH DAMAGES.
|
||||
|
||||
17. Interpretation of Sections 15 and 16.
|
||||
|
||||
If the disclaimer of warranty and limitation of liability provided
|
||||
above cannot be given local legal effect according to their terms,
|
||||
reviewing courts shall apply local law that most closely approximates
|
||||
an absolute waiver of all civil liability in connection with the
|
||||
Program, unless a warranty or assumption of liability accompanies a
|
||||
copy of the Program in return for a fee.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
||||
|
||||
How to Apply These Terms to Your New Programs
|
||||
|
||||
If you develop a new program, and you want it to be of the greatest
|
||||
possible use to the public, the best way to achieve this is to make it
|
||||
free software which everyone can redistribute and change under these terms.
|
||||
|
||||
To do so, attach the following notices to the program. It is safest
|
||||
to attach them to the start of each source file to most effectively
|
||||
state the exclusion of warranty; and each file should have at least
|
||||
the "copyright" line and a pointer to where the full notice is found.
|
||||
|
||||
PlaySync
|
||||
Copyright (C) 2020 Blender
|
||||
|
||||
This program is free software: you can redistribute it and/or modify
|
||||
it under the terms of the GNU General Public License as published by
|
||||
the Free Software Foundation, either version 3 of the License, or
|
||||
(at your option) any later version.
|
||||
|
||||
This program is distributed in the hope that it will be useful,
|
||||
but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
GNU General Public License for more details.
|
||||
|
||||
You should have received a copy of the GNU General Public License
|
||||
along with this program. If not, see <http://www.gnu.org/licenses/>.
|
||||
|
||||
Also add information on how to contact you by electronic and paper mail.
|
||||
|
||||
If the program does terminal interaction, make it output a short
|
||||
notice like this when it starts in an interactive mode:
|
||||
|
||||
PlaySync Copyright (C) 2020 Blender
|
||||
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
|
||||
This is free software, and you are welcome to redistribute it
|
||||
under certain conditions; type `show c' for details.
|
||||
|
||||
The hypothetical commands `show w' and `show c' should show the appropriate
|
||||
parts of the General Public License. Of course, your program's commands
|
||||
might be different; for a GUI interface, you would use an "about box".
|
||||
|
||||
You should also get your employer (if you work as a programmer) or school,
|
||||
if any, to sign a "copyright disclaimer" for the program, if necessary.
|
||||
For more information on this, and how to apply and follow the GNU GPL, see
|
||||
<http://www.gnu.org/licenses/>.
|
||||
|
||||
The GNU General Public License does not permit incorporating your program
|
||||
into proprietary programs. If your program is a subroutine library, you
|
||||
may consider it more useful to permit linking proprietary applications with
|
||||
the library. If this is what you want to do, use the GNU Lesser General
|
||||
Public License instead of this License. But first, please read
|
||||
<http://www.gnu.org/philosophy/why-not-lgpl.html>.
|
||||
@@ -0,0 +1,4 @@
# Cache Manager

This add-on streamlines working with Alembic caches. It is no longer used in production at the Blender Studio and is not actively maintained.

You can find the documentation [here](https://studio.blender.org/tools/addons/cache_manager).
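For reference, the cacheconfig the add-on reads and writes is plain JSON. A minimal sketch of its top-level layout (field names taken from the add-on's internal templates; all paths and names below are purely illustrative):

```python
import json

# Hypothetical sketch of the cacheconfig layout; paths/names are illustrative.
cacheconfig = {
    "meta": {
        "blendfile": "/shots/sq010/sh010.anim.blend",  # illustrative
        "frame_start": 101,
        "frame_end": 220,
    },
    "libs": {
        "/assets/char.blend": {  # illustrative library file
            "data_from": {
                "collections": {
                    # reference collection -> variant -> cachefile
                    "CH-char": {
                        "CH-char.001": {"cachefile": "/cache/CH-char.001.abc"},
                    },
                },
            },
        },
    },
    "objects": {},  # driven data paths per object, values stored per frame
    "cameras": {},
}

# Round-trip through JSON, as the add-on does when saving/loading a cacheconfig.
loaded = json.loads(json.dumps(cacheconfig, indent=2))
print(sorted(loaded))  # ['cameras', 'libs', 'meta', 'objects']
```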
@@ -0,0 +1,63 @@
# SPDX-FileCopyrightText: 2021 Blender Studio Tools Authors
#
# SPDX-License-Identifier: GPL-3.0-or-later

import bpy
from . import (
    cmglobals,
    logger,
    cache,
    models,
    prefs,
    propsdata,
    props,
    opsdata,
    ops,
    ui
)

logg = logger.LoggerFactory.getLogger(__name__)

bl_info = {
    "name": "Cache Manager",
    "author": "Paul Golter",
    "description": "Blender addon to streamline alembic caches of assets",
    "blender": (2, 93, 0),
    "version": (0, 1, 2),
    "location": "View3D",
    "warning": "",
    "doc_url": "",
    "tracker_url": "",
    "category": "Generic",
}

_need_reload = "ops" in locals()

if _need_reload:
    import importlib

    cmglobals = importlib.reload(cmglobals)
    logger = importlib.reload(logger)
    cache = importlib.reload(cache)
    models = importlib.reload(models)
    prefs = importlib.reload(prefs)
    propsdata = importlib.reload(propsdata)
    props = importlib.reload(props)
    opsdata = importlib.reload(opsdata)
    ops = importlib.reload(ops)
    ui = importlib.reload(ui)


def register():
    prefs.register()
    props.register()
    ops.register()
    ui.register()
    logg.info("Registered cache-manager")


def unregister():
    ui.unregister()
    ops.unregister()
    props.unregister()
    prefs.unregister()
@@ -0,0 +1,19 @@
schema_version = "1.0.0"

id = "cache_manager"
version = "0.1.2"
name = "Cache Manager"
tagline = "Streamline working with alembic caches"
maintainer = "Blender Studio"
type = "add-on"
website = "https://studio.blender.org/tools/addons/cache_manager"
tags = ["Bake"]

blender_version_min = "4.2.0"

license = [
  "SPDX:GPL-3.0-or-later",
]
copyright = [
  "2019-2025 Paul Golter & Blender Studio",
]
@@ -0,0 +1,861 @@
# SPDX-FileCopyrightText: 2021 Blender Studio Tools Authors
#
# SPDX-License-Identifier: GPL-3.0-or-later

import json
import contextlib

from datetime import datetime
from pathlib import Path
from typing import Dict, List, Any, Union, Optional
from copy import deepcopy

import bpy

from . import propsdata, cmglobals, opsdata
from .logger import LoggerFactory, log_new_lines

logger = LoggerFactory.getLogger(__name__)
def is_valid_cache_object(obj: bpy.types.Object) -> bool:
    if obj.type not in cmglobals.VALID_OBJECT_TYPES:
        return False

    if obj.type in {"CAMERA", "EMPTY", "LATTICE"}:
        return True

    return obj.name.startswith("GEO")


def is_valid_cache_coll(coll: bpy.types.Collection) -> bool:
    if opsdata.is_item_local(coll) and not bpy.data.filepath:
        return False
    return True


def get_valid_cache_objects(collection: bpy.types.Collection) -> List[bpy.types.Object]:
    object_list = [obj for obj in collection.all_objects if is_valid_cache_object(obj)]
    return object_list


def get_current_time_string(date_format: str) -> str:
    now = datetime.now()
    current_time_string = now.strftime(date_format)
    return current_time_string
def get_ref_coll_by_name(coll_name: str) -> bpy.types.Collection:
    coll = bpy.data.collections[coll_name]

    if not coll.override_library:
        return coll

    return coll.override_library.reference


def get_ref_coll(coll: bpy.types.Collection) -> bpy.types.Collection:
    if not coll.override_library:
        return coll

    return coll.override_library.reference


def read_json(filepath: Path) -> Any:
    with open(filepath.as_posix(), "r") as file:
        json_dict = json.loads(file.read())
        return json_dict


def save_as_json(data: Any, filepath: Path) -> None:
    with open(filepath.as_posix(), "w+") as file:
        json.dump(data, file, indent=2)


@contextlib.contextmanager
def temporary_current_frame(context):
    """Allows the context to set the scene current frame, restores it on exit.

    Yields the initial current frame, so it can be used for reference in the context.
    """
    current_frame = context.scene.frame_current
    try:
        yield current_frame
    finally:
        context.scene.frame_current = current_frame
class CacheConfig:
    def __init__(self, filepath: Path):
        self.filepath: Path = filepath
        self._load(filepath)

    def _load(self, filepath: Path) -> None:
        self._json_obj: Dict[str, Any] = read_json(self.filepath)
        self.filepath = filepath
        logger.info("Loaded cacheconfig from: %s", filepath.as_posix())

    @property
    def json_obj(self) -> Any:
        return self._json_obj

    # Meta.

    def get_meta(self) -> Dict[str, Any]:
        return deepcopy(self._json_obj["meta"])

    def get_meta_key(self, key: str) -> Any:
        return self._json_obj["meta"][key]

    # Libfiles / Collections.
    def get_all_libfiles(self):
        return self._json_obj["libs"].keys()

    def get_all_coll_ref_names(self, libfile: str) -> List[str]:
        return sorted(
            self._json_obj["libs"][libfile]["data_from"]["collections"].keys()
        )

    def get_cachefile(self, libfile: str, coll_ref_name: str, variant: str) -> str:
        return self._json_obj["libs"][libfile]["data_from"]["collections"][
            coll_ref_name
        ][variant]["cachefile"]

    def get_all_collvariants(self, libfile: str, coll_ref_name: str) -> Dict[str, Any]:
        return deepcopy(
            self._json_obj["libs"][libfile]["data_from"]["collections"][coll_ref_name]
        )

    # Remapping.
    def get_coll_to_lib_mapping(self) -> Dict[str, str]:
        remapping = {}
        for libfile in self._json_obj["libs"]:
            for coll_str in self._json_obj["libs"][libfile]["data_from"]["collections"]:
                for variant_name in self._json_obj["libs"][libfile]["data_from"][
                    "collections"
                ][coll_str]:
                    remapping[variant_name] = libfile
        return remapping

    # Objs / Cams.
    def get_animation_data(self, obj_category: str) -> Dict[str, Any]:
        return deepcopy(self._json_obj[obj_category])

    def get_all_obj_names(self, obj_category: str) -> List[str]:
        return sorted(self._json_obj[obj_category].keys())

    def get_obj(self, obj_category: str, obj_name: str) -> Optional[Dict[str, Any]]:
        try:
            anim_obj_dict = self._json_obj[obj_category][obj_name]
        except KeyError:
            logger.error(
                "%s not found in cacheconfig.",
                obj_name,
            )
            return None
        return deepcopy(anim_obj_dict)

    def get_all_data_paths(self, obj_category: str, obj_name: str) -> List[str]:
        return self._json_obj[obj_category][obj_name]["data_paths"].keys()

    def get_all_data_path_values(
        self, obj_category: str, obj_name: str, data_path: str
    ) -> List[Any]:
        return deepcopy(
            self._json_obj[obj_category][obj_name]["data_paths"][data_path]["value"]
        )

    def get_data_path_value(
        self, obj_category: str, obj_name: str, data_path: str, frame: int
    ) -> Any:
        return self._json_obj[obj_category][obj_name]["data_paths"][data_path]["value"][
            frame
        ]

    def get_abc_obj_path(self, obj_name: str):
        try:
            abc_path = self._json_obj["objects"][obj_name]["abc_obj_path"]
        except KeyError:
            logger.error(
                "%s not found in cacheconfig. Failed to get abc obj cache path.",
                obj_name,
            )
            return None

        return abc_path
class CacheConfigBlueprint(CacheConfig):
    _CACHECONFIG_TEMPL: Dict[str, Any] = {
        "meta": {},
        "libs": {},
        "objects": {},
        "cameras": {},
    }
    _LIBDICT_TEMPL: Dict[str, Any] = {
        "data_from": {"collections": {}},  # {'colname': {'cachefile': cachepath}}
    }
    _OBJ_DICT_TEMPL: Dict[str, Any] = {"type": "", "abc_obj_path": "", "data_paths": {}}
    _DATA_PATH_DICT: Dict[str, List[Any]] = {"value": []}

    def __init__(self):
        self._json_obj: Dict[str, Any] = deepcopy(self._CACHECONFIG_TEMPL)

    def init_by_file(self, filepath: Path) -> None:
        self._json_obj = read_json(filepath)

    def save_as_cacheconfig(self, filepath: Path) -> None:
        save_as_json(self._json_obj, filepath)

    # Meta.

    def set_meta_key(self, key: str, value: Any) -> None:
        self._json_obj["meta"][key] = value

    # Lib.
    def _ensure_lib(self, libfile: str) -> None:
        self._json_obj["libs"].setdefault(libfile, deepcopy(self._LIBDICT_TEMPL))

    # Collection.
    def _ensure_coll_ref(self, libfile: str, coll_ref_name: str) -> None:
        self._json_obj["libs"][libfile]["data_from"]["collections"].setdefault(
            coll_ref_name, {}
        )

    def _ensure_coll_variant(
        self, libfile: str, coll_ref_name: str, coll_var_name: str
    ) -> None:
        self._json_obj["libs"][libfile]["data_from"]["collections"][
            coll_ref_name
        ].setdefault(coll_var_name, {})

    def set_coll_variant(
        self,
        libfile: str,
        coll_ref_name: str,
        coll_var_name: str,
        coll_dict: Dict[str, Any],
    ) -> None:
        self._ensure_lib(libfile)
        self._ensure_coll_ref(libfile, coll_ref_name)
        self._ensure_coll_variant(libfile, coll_ref_name, coll_var_name)

        self._json_obj["libs"][libfile]["data_from"]["collections"][coll_ref_name][
            coll_var_name
        ] = coll_dict

    # Objs / Cameras.
    def _ensure_obj(self, obj_category: str, obj_name: str) -> None:
        self._json_obj[obj_category].setdefault(
            obj_name, deepcopy(self._OBJ_DICT_TEMPL)
        )

    def set_obj_key(
        self, obj_category: str, obj_name: str, key: str, value: Any
    ) -> None:
        self._ensure_obj(obj_category, obj_name)
        self._json_obj[obj_category][obj_name][key] = value

    def add_obj_data_path(
        self, obj_category: str, obj_name: str, data_path: str
    ) -> None:
        self._ensure_obj(obj_category, obj_name)
        self._json_obj[obj_category][obj_name]["data_paths"][data_path] = deepcopy(
            self._DATA_PATH_DICT
        )

    def append_value_to_data_path(
        self, obj_category: str, obj_name: str, data_path: str, value: Any
    ) -> None:
        # Convert bpy_prop_array to tuple, otherwise json serialization fails.
        if type(value).__name__ == "bpy_prop_array":
            value = tuple(value)

        self._json_obj[obj_category][obj_name]["data_paths"][data_path]["value"].append(
            value
        )

    def get_data_path_dict_templ(self) -> Dict[str, Any]:
        return deepcopy(self._DATA_PATH_DICT)
class CacheConfigProcessor:
    @classmethod
    def import_collections(
        cls, cacheconfig: CacheConfig, context: bpy.types.Context, link: bool = True
    ) -> List[bpy.types.Collection]:
        # Link collections in bpy.data of this blend file.
        cls._import_data_from_libfiles(cacheconfig, link=link)

        # Create.
        colls = cls._instance_colls_to_scene_and_override(cacheconfig, context)
        return colls

    @classmethod
    def _import_data_from_libfiles(
        cls, cacheconfig: CacheConfig, link: bool = True
    ) -> None:
        noun = "Linked" if link else "Appended"

        for libfile in cacheconfig.get_all_libfiles():
            libpath = Path(libfile)

            with bpy.data.libraries.load(
                libpath.as_posix(), relative=True, link=link
            ) as (
                data_from,
                data_to,
            ):
                for coll_name in cacheconfig.get_all_coll_ref_names(libfile):
                    if coll_name not in data_from.collections:
                        logger.error(
                            "Failed to import collection %s from %s. Doesn't exist in file.",
                            coll_name,
                            libpath.as_posix(),
                        )
                        continue

                    if coll_name in data_to.collections:
                        logger.info("Collection %s already in blendfile.", coll_name)
                        continue

                    data_to.collections.append(coll_name)
                    logger.info(
                        "%s collection: %s from library: %s",
                        noun,
                        coll_name,
                        libpath.as_posix(),
                    )
    @classmethod
    def _instance_colls_to_scene_and_override(
        cls, cacheconfig: CacheConfig, context: bpy.types.Context
    ) -> List[bpy.types.Collection]:
        # List of collections to track which ones got imported.
        colls: List[bpy.types.Collection] = []

        for libfile in cacheconfig.get_all_libfiles():

            # Link collections in the current scene and add the cm.cachefile property.
            for coll_name in cacheconfig.get_all_coll_ref_names(libfile):

                # For each variant add an instance object.
                for variant_name in sorted(
                    cacheconfig.get_all_collvariants(libfile, coll_name)
                ):
                    if cls._is_coll_variant_in_blend(variant_name):
                        logger.info("Collection %s already exists. Skip.", variant_name)
                        continue

                    logger.info(
                        "Collection variant %s does not exist yet. Will create.",
                        variant_name,
                    )

                    # Get the source collection and create a collection instance of it.
                    source_collection = get_ref_coll_by_name(coll_name)
                    instance_obj = cls._create_collection_instance(
                        source_collection, variant_name
                    )

                    # Add a library override to the collection instance.
                    cls._make_library_override(instance_obj, context)

                    # Add collection properties.
                    # TODO: Risky lookup. We have no influence on the naming of objects
                    # created by bpy.ops.object.make_override_library(); we can only
                    # hope that no other object messes up the name incrementation,
                    # otherwise the cache would no longer work.
                    coll = bpy.data.collections[variant_name, None]
                    cachefile = cacheconfig.get_cachefile(
                        libfile, coll_name, variant_name
                    )

                    # Set cm.cachefile property.
                    coll.cm.cachefile = cachefile
                    opsdata.add_coll_to_cache_collections(context, coll, "IMPORT")
                    colls.append(coll)

                    logger.info(
                        "%s assigned cachefile: %s (variant: %s)",
                        coll.name,
                        cachefile,
                        variant_name,
                    )

        return sorted(colls, key=lambda x: x.name)
    @classmethod
    def _is_coll_variant_in_blend(cls, variant_name: str) -> bool:
        # Check if variant is already in this blend file.
        try:
            coll = bpy.data.collections[variant_name, None]
        except KeyError:
            return False
        else:
            # Collection already exists. Not skipping here would add another
            # collection instance, which then gets overridden and increments
            # the object names -> caches won't work.
            if coll.library:
                return False
            return True
    @classmethod
    def _create_collection_instance(
        cls, source_collection: bpy.types.Collection, variant_name: str
    ) -> bpy.types.Object:
        # The variant name has no effect on how the overridden library collection
        # will be named in the end; it is supplied here just for logging purposes.

        # Use an empty to instance the source collection.
        instance_obj = bpy.data.objects.new(name=variant_name, object_data=None)
        instance_obj.instance_collection = source_collection
        instance_obj.instance_type = "COLLECTION"

        parent_collection = bpy.context.view_layer.active_layer_collection
        parent_collection.collection.objects.link(instance_obj)

        logger.info(
            "Instanced collection: %s as: %s (variant: %s)",
            source_collection.name,
            instance_obj.name,
            variant_name,
        )

        return instance_obj

    @classmethod
    def _make_library_override(
        cls, instance_obj: bpy.types.Object, context: bpy.types.Context
    ) -> None:
        log_name = instance_obj.name
        # Deselect all.
        bpy.ops.object.select_all(action="DESELECT")

        # Needs an active object (the collection instance).
        context.view_layer.objects.active = instance_obj
        instance_obj.select_set(True)

        # Add lib override.
        bpy.ops.object.make_override_library()

        logger.info(
            "%s make library override.",
            log_name,
        )
    @classmethod
    def import_animation_data(
        cls, cacheconfig: CacheConfig, colls: List[bpy.types.Collection]
    ) -> None:
        colls = sorted(colls, key=lambda x: x.name)
        frame_in = cacheconfig.get_meta_key("frame_start")
        frame_out = cacheconfig.get_meta_key("frame_end")

        log_new_lines(1)
        logger.info("-START- Importing Animation Data %i - %i", frame_in, frame_out)

        objs_load_anim: List[bpy.types.Object] = []
        cams_load_anim: List[bpy.types.Camera] = []

        # Gather all objects to load animation on.
        for coll in colls:
            for obj in coll.all_objects:
                if not is_valid_cache_object(obj):
                    continue

                if obj.type == "CAMERA":
                    cams_load_anim.append(obj.data)
                    continue

                objs_load_anim.append(obj)

        # Extend object list with cameras.
        objs_load_anim.extend(cams_load_anim)

        # Import animation data for objects.
        cls._import_animation_data_objects(cacheconfig, objs_load_anim)

        log_new_lines(1)
        logger.info("-END- Importing Animation Data")
    @classmethod
    def _import_animation_data_objects(
        cls,
        cacheconfig: CacheConfig,
        objects: List[Union[bpy.types.Object, bpy.types.Camera]],
    ) -> None:
        frame_in = cacheconfig.get_meta_key("frame_start")
        frame_out = cacheconfig.get_meta_key("frame_end")

        # Check if obj in collection is in the cacheconfig;
        # if so, key all data paths with the values from the cacheconfig.

        for obj in objects:

            obj_category = "objects"
            if obj.type in cmglobals.CAMERA_TYPES:
                obj_category = "cameras"

            obj_name = obj.name
            obj_dict = cacheconfig.get_obj(obj_category, obj_name)

            if not obj_dict:
                continue

            anim_props_list = []  # For log.
            muted_drivers = []  # For log.

            # Get property that was driven and set keyframes.
            for data_path in cacheconfig.get_all_data_paths(obj_category, obj_name):

                # Disable drivers.
                muted_drivers.extend(
                    opsdata.disable_drivers_by_data_path([obj], data_path)
                )

                # For log.
                anim_props_list.append(data_path)

                # Insert keyframe for frames in json_obj.
                for frame in range(frame_in, frame_out + 1):

                    # Get value to set prop to.
                    prop_value = cacheconfig.get_data_path_value(
                        obj_category, obj_name, data_path, frame - frame_in
                    )

                    # Quote string props so exec works.
                    if isinstance(prop_value, str):
                        prop_value = f'"{prop_value}"'

                    # Get the right delimiter.
                    delimiter = "."
                    if data_path.startswith("["):
                        delimiter = ""

                    # Get the right data category.
                    command = f'bpy.data.{obj_category}["{obj_name}", None]{delimiter}{data_path}={prop_value}'

                    # Set property and insert keyframe.
                    exec(command)
                    obj.keyframe_insert(data_path=data_path, frame=frame)

            if muted_drivers:
                logger.info(
                    "%s disabled drivers: %s",
                    obj_name,
                    ", ".join([m.data_path for m in muted_drivers]),
                )
            if anim_props_list:
                logger.info(
                    "%s imported animation for data paths: %s",
                    obj_name,
                    ", ".join(anim_props_list),
                )
class CacheConfigFactory:

    _DATE_FORMAT = "%Y-%m-%dT%H:%M:%S"

    @classmethod
    def gen_config_from_colls(
        cls,
        context: bpy.types.Context,
        colls: List[bpy.types.Collection],
        filepath: Path,
    ) -> CacheConfig:
        blueprint = CacheConfigBlueprint()

        colls = sorted(colls, key=lambda x: x.name)

        # If cacheconfig already exists, load it and update entries.
        if filepath.exists():
            logger.info(
                "Cacheconfig already exists: %s. Will update entries.",
                filepath.as_posix(),
            )
            blueprint.init_by_file(filepath)

        log_new_lines(2)
        noun = "Updating" if filepath.exists() else "Creating"
        logger.info("-START- %s CacheConfig", noun)

        # Populate metadata.
        cls._populate_metadata(context, blueprint)

        # Populate cacheconfig with libs based on collections.
        cls._populate_libs(context, colls, blueprint)

        # Populate cacheconfig with animation data.
        objects_with_anim = cls._populate_with_objs(colls, blueprint)

        # Populate cacheconfig with cameras.
        cams_to_cache = cls._populate_with_cameras(colls, blueprint)

        # Add cameras to objects-with-anim list.
        objects_with_anim.extend(cams_to_cache)

        # Get driven values for each frame.
        cls._store_data_path_values(context, objects_with_anim, blueprint)

        # Save json obj to disk.
        blueprint.save_as_cacheconfig(filepath)
        logger.info("Generated cacheconfig and saved to: %s", filepath.as_posix())

        log_new_lines(1)
        logger.info("-END- %s CacheConfig", noun)

        return CacheConfig(filepath)
    @classmethod
    def _populate_metadata(
        cls, context: bpy.types.Context, blueprint: CacheConfigBlueprint
    ) -> CacheConfigBlueprint:
        blueprint.set_meta_key(
            "blendfile",
            Path(bpy.data.filepath).absolute().as_posix()
            if bpy.data.filepath
            else "unsaved_blendfile",
        )

        blueprint.set_meta_key(
            "name",
            Path(bpy.data.filepath).name if bpy.data.filepath else "unsaved_blendfile",
        )

        if "creation_date" not in blueprint.get_meta():
            blueprint.set_meta_key(
                "creation_date", get_current_time_string(cls._DATE_FORMAT)
            )

        blueprint.set_meta_key("updated_at", get_current_time_string(cls._DATE_FORMAT))

        blueprint.set_meta_key("frame_start", context.scene.frame_start)

        blueprint.set_meta_key("frame_end", context.scene.frame_end)

        logger.info("Created metadata")
        return blueprint
    @classmethod
    def _populate_libs(
        cls,
        context: bpy.types.Context,
        colls: List[bpy.types.Collection],
        blueprint: CacheConfigBlueprint,
    ) -> CacheConfigBlueprint:
        colls = sorted(colls, key=lambda x: x.name)

        # Get libraries.
        for coll in colls:
            libfile = opsdata.get_item_libfile(coll)
            coll_ref = get_ref_coll(coll)

            # Create collection dict based on this variant collection.
            _coll_dict = {
                "cachefile": propsdata.gen_cachepath_collection(
                    coll, context
                ).as_posix(),
            }

            # Set blueprint coll variant.
            blueprint.set_coll_variant(libfile, coll_ref.name, coll.name, _coll_dict)

        # Log.
        for libfile in blueprint.get_all_libfiles():
            logger.info(
                "Gathered libfile: %s with collections: %s",
                libfile,
                ", ".join(blueprint.get_all_coll_ref_names(libfile)),
            )

        return blueprint
    @classmethod
    def _populate_with_objs(
        cls,
        colls: List[bpy.types.Collection],
        blueprint: CacheConfigBlueprint,
    ) -> List[bpy.types.Object]:
        objects_with_anim: List[bpy.types.Object] = []

        for coll in colls:

            obj_category = "objects"

            # Loop over all objects in that collection.
            for obj in coll.all_objects:

                is_anim = False

                if not is_valid_cache_object(obj):
                    continue

                # Set abc_obj_path.
                blueprint.set_obj_key(
                    obj_category,
                    obj.name,
                    "abc_obj_path",
                    str(opsdata.gen_abc_object_path(obj)),
                )

                # Set type.
                blueprint.set_obj_key(obj_category, obj.name, "type", str(obj.type))

                if not obj.animation_data:
                    continue

                if not obj.animation_data.drivers:
                    continue

                # For now we only write data paths that are driven.
                # TODO: detect properties that have an animation or are driven.
                for driver in obj.animation_data.drivers:

                    # Seems to be an override resync issue that old data paths are
                    # still in .drivers even though they don't exist anymore;
                    # filter them out like this:
                    try:
                        obj.path_resolve(driver.data_path)
                    except ValueError:
                        continue

                    # Don't export animation for visibility of modifiers.
                    data_path = driver.data_path.split(".")

                    if len(data_path) > 1:
                        if data_path[0].startswith("modifiers"):
                            if data_path[-1] in cmglobals.DRIVER_VIS_DATA_PATHS:
                                continue

                    # Add data path of driver to the obj data paths dict.
                    blueprint.add_obj_data_path(
                        obj_category, obj.name, driver.data_path
                    )

                    is_anim = True

                if is_anim:
                    objects_with_anim.append(obj)
        # Log.
        logger.info("Populated CacheConfig with animated properties.")

        return objects_with_anim

    @classmethod
def _populate_with_cameras(
|
||||
cls,
|
||||
colls: List[bpy.types.Collection],
|
||||
blueprint: CacheConfigBlueprint,
|
||||
) -> List[bpy.types.Camera]:
|
||||
|
||||
obj_category = "cameras"
|
||||
cams_to_cache: List[bpy.types.Camera] = []
|
||||
|
||||
for cam in bpy.data.cameras:
|
||||
|
||||
if opsdata.is_item_local(cam) and not bpy.data.filepath:
|
||||
logger.error(
|
||||
"Failed to add local camera %s to cacheconfig. Blend files needs to be saved.",
|
||||
cam.name,
|
||||
)
|
||||
continue
|
||||
|
||||
if opsdata.is_item_lib_source(cam):
|
||||
logger.error(
|
||||
"Failed to add library data camera %s to cacheconfig. Skip.",
|
||||
cam.name,
|
||||
)
|
||||
continue
|
||||
|
||||
libfile = opsdata.get_item_libfile(cam)
|
||||
|
||||
# Make sure to only export cams that are in current cache collections.
|
||||
if libfile not in blueprint.get_all_libfiles():
|
||||
continue
|
||||
|
||||
# Set type.
|
||||
blueprint.set_obj_key(obj_category, cam.name, "type", str(cam.type))
|
||||
|
||||
cams_to_cache.append(cam)
|
||||
|
||||
for data_path in cmglobals.CAM_DATA_PATHS:
|
||||
blueprint.add_obj_data_path(obj_category, cam.name, data_path)
|
||||
|
||||
logger.info("Populated CacheConfig with cameras.")
|
||||
|
||||
return cams_to_cache
|
||||
|
||||
@classmethod
|
||||
def _store_data_path_values(
|
||||
cls,
|
||||
context: bpy.types.Context,
|
||||
objects: List[bpy.types.Object],
|
||||
blueprint: CacheConfigBlueprint,
|
||||
) -> CacheConfigBlueprint:
|
||||
|
||||
# Get driver values for each frame.
|
||||
fin = context.scene.frame_start
|
||||
fout = context.scene.frame_end
|
||||
frame_range = range(fin, fout + 1)
|
||||
|
||||
with temporary_current_frame(context) as original_curframe:
|
||||
for frame in frame_range:
|
||||
context.scene.frame_set(frame)
|
||||
logger.info("Storing animation data for frame %i", frame)
|
||||
|
||||
for obj in objects:
|
||||
obj_category = "objects"
|
||||
if obj.type in cmglobals.CAMERA_TYPES:
|
||||
obj_category = "cameras"
|
||||
|
||||
for data_path in blueprint.get_all_data_paths(
|
||||
obj_category, obj.name
|
||||
):
|
||||
data_path_value = obj.path_resolve(data_path)
|
||||
blueprint.append_value_to_data_path(
|
||||
obj_category, obj.name, data_path, data_path_value
|
||||
)
|
||||
|
||||
# Log.
|
||||
logger.info(
|
||||
"Stored data for animated properties (%i, %i).",
|
||||
fin,
|
||||
fout,
|
||||
)
|
||||
return blueprint
|
||||
|
||||
@classmethod
|
||||
def load_config_from_file(cls, filepath: Path) -> CacheConfig:
|
||||
if not filepath.exists():
|
||||
raise ValueError(
|
||||
f"Failed to load config. Path does not exist: {filepath.as_posix()}"
|
||||
)
|
||||
|
||||
return CacheConfig(filepath)
|
||||
@@ -0,0 +1,61 @@
# SPDX-FileCopyrightText: 2021 Blender Studio Tools Authors
#
# SPDX-License-Identifier: GPL-3.0-or-later

from typing import List

MODIFIER_NAME = "cm_cache"

CACHE_OFF_SUFFIX = ".cacheoff"
CACHE_ON_SUFFIX = ".cacheon"

CONSTRAINT_NAME = "cm_cache"

VALID_OBJECT_TYPES = {"MESH", "CAMERA", "EMPTY", "LATTICE"}
CAMERA_TYPES = {"PERSP", "ORTHO", "PANO"}

# Raw string so the regex escapes are not interpreted as string escapes.
_VERSION_PATTERN = r"v\d\d\d"

MODIFIERS_KEEP: List[str] = [
    "SUBSURF",
    "PARTICLE_SYSTEM",
    "MESH_SEQUENCE_CACHE",
    "DATA_TRANSFER",
    "NORMAL_EDIT",
    "NODES",
]
CONSTRAINTS_KEEP: List[str] = [
    "TRANSFORM_CACHE",
]

DRIVER_VIS_DATA_PATHS: List[str] = [
    "hide_viewport",
    "hide_render",
    "show_viewport",
    "show_render",
]

CAM_DATA_PATHS: List[str] = [
    "clip_end",
    "clip_start",
    "display_size",
    "dof.aperture_blades",
    "dof.aperture_fstop",
    "dof.aperture_ratio",
    "dof.aperture_rotation",
    "dof.focus_distance",
    "lens",
    "ortho_scale",
    "sensor_fit",
    "sensor_height",
    "sensor_width",
    "shift_x",
    "shift_y",
]

INSTANCE_TYPES: List[str] = ["NONE", "COLLECTION", "VERTS", "FACES"]

# Camera data paths currently not cached:
# "lens_unit",
# "angle",
# "angle_x",
# "angle_y",
@@ -0,0 +1,26 @@
# SPDX-FileCopyrightText: 2021 Blender Studio Tools Authors
#
# SPDX-License-Identifier: GPL-3.0-or-later

import logging


class LoggerFactory:
    """
    Utility class to streamline logger creation.
    """

    @staticmethod
    def getLogger(name=__name__):
        return logging.getLogger(name)


logger = LoggerFactory.getLogger(__name__)


def gen_processing_string(item: str) -> str:
    return f"---Processing {item}".ljust(50, "-")


def log_new_lines(multiplier: int) -> None:
    print("\n" * multiplier)
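As a minimal standalone sketch, `gen_processing_string` pads a log header to a fixed width so processing sections line up in the console output (the function body is reproduced here so the example runs without the addon):

```python
def gen_processing_string(item: str) -> str:
    # Left-justify and pad with '-' to a fixed width of 50 for aligned log headers.
    return f"---Processing {item}".ljust(50, "-")


header = gen_processing_string("CH-rex")
print(header)       # "---Processing CH-rex" followed by padding dashes
print(len(header))  # → 50
```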
@@ -0,0 +1,85 @@
# SPDX-FileCopyrightText: 2021 Blender Studio Tools Authors
#
# SPDX-License-Identifier: GPL-3.0-or-later

from pathlib import Path
from typing import Optional, List, Tuple

from .logger import LoggerFactory

logger = LoggerFactory.getLogger(__name__)


class FolderListModel:
    def __init__(self):
        self.__root_path: Optional[Path] = None
        self.__folders: List[str] = []
        self.__appended: List[str] = []
        self.__combined: List[str] = []

    def rowCount(self) -> int:
        return len(self.__combined)

    def data(self, row: int) -> Optional[str]:
        if len(self.__combined) > 0:
            return self.__combined[row]

        return None

    @property
    def root_path(self) -> Optional[Path]:
        return self.__root_path

    @root_path.setter
    def root_path(self, path: Path) -> None:

        if not path or not path.absolute().exists():
            logger.debug("Invalid path: %s", str(path))
            self.reset()
        else:
            self.__root_path = path
            logger.debug("FolderListModel root path was set to %s", path.as_posix())
            self.__load_dir(self.__root_path)

    def reset(self) -> None:
        self.__root_path = None
        self.__folders.clear()
        self.__appended.clear()
        self.__update_combined()

    def reload(self) -> None:
        self.__folders.clear()
        self.__appended.clear()
        self.root_path = self.__root_path

    def __load_dir(self, path: Path) -> None:
        self.__folders = self.__detect_folders(path)
        self.__appended.clear()
        self.__update_combined()

    def __detect_folders(self, path: Path) -> List[str]:
        if path.exists() and path.is_dir():
            # Iterate through the directory and return the names of all paths that are dirs.
            return sorted(
                [str(x.name) for x in path.iterdir() if x.is_dir()], reverse=True
            )
        else:
            return []

    def append_item(self, item: str) -> None:
        self.__appended.append(item)
        self.__update_combined()

    def __update_combined(self) -> None:
        self.__combined.clear()
        self.__combined.extend(
            sorted(list(set(self.__folders + self.__appended)), reverse=True)
        )

    @property
    def items(self) -> List[str]:
        return self.__combined

    @property
    def items_as_enum_list(self) -> List[Tuple[str, str, str]]:
        return [(item, item, "") for item in self.__combined]
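The model above merges on-disk folders with appended entries into one reverse-sorted list. A minimal sketch of that detect-and-merge behavior, inlined so it runs without the class (the folder names here are illustrative):

```python
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    for name in ("v001", "v002"):
        (root / name).mkdir()
    # A plain file should be ignored: only directories count as versions.
    (root / "notes.txt").write_text("not a directory")

    # What __detect_folders does: collect directory names, newest first.
    folders = sorted([x.name for x in root.iterdir() if x.is_dir()], reverse=True)
    # What append_item + __update_combined do: merge a fresh increment in,
    # de-duplicated and reverse-sorted.
    appended = ["v003"]
    combined = sorted(set(folders + appended), reverse=True)

print(combined)  # → ['v003', 'v002', 'v001']
```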
File diff suppressed because it is too large
@@ -0,0 +1,953 @@
# SPDX-FileCopyrightText: 2021 Blender Studio Tools Authors
#
# SPDX-License-Identifier: GPL-3.0-or-later

import re
import os
from pathlib import Path
from typing import List, Tuple, Generator, Dict, Union, Any, Optional

import bpy

from . import cmglobals
from .logger import LoggerFactory, log_new_lines
from .models import FolderListModel

logger = LoggerFactory.getLogger(__name__)

VERSION_DIR_MODEL = FolderListModel()

_cachefiles_enum_list: List[Tuple[str, str, str]] = []
_versions_enum_list: List[Tuple[str, str, str]] = []
_version_dir_model_init: bool = False


def init_version_dir_model(
    context: bpy.types.Context,
) -> None:

    global VERSION_DIR_MODEL
    global _version_dir_model_init

    # Is None if invalid.
    if not context.scene.cm.cache_version_dir_path:
        logger.error(
            "Failed to initialize version directory model. Invalid path. Check addon preferences."
        )
        return

    cache_version_dir = Path(context.scene.cm.cache_version_dir_path)

    VERSION_DIR_MODEL.reset()
    VERSION_DIR_MODEL.root_path = cache_version_dir

    if context.scene.cm.category == "EXPORT":
        if not VERSION_DIR_MODEL.items:
            VERSION_DIR_MODEL.append_item("v001")

    _version_dir_model_init = True


def get_version(str_value: str, format: type = str) -> Union[str, int, None]:
    match = re.search(cmglobals._VERSION_PATTERN, str_value)
    if match:
        version = match.group()
        if format == str:
            return version
        if format == int:
            return int(version.replace("v", ""))
    return None


def add_version_increment() -> str:
    items = VERSION_DIR_MODEL.items  # Already sorted, newest version first.

    versions = [get_version(item) for item in items if get_version(item)]

    if len(versions) > 0:
        latest_version = versions[0]
        increment = "v{:03}".format(int(latest_version.replace("v", "")) + 1)
    else:
        increment = "v001"

    VERSION_DIR_MODEL.append_item(increment)
    return increment


def get_versions_enum_list(
    self: Any,
    context: bpy.types.Context,
) -> List[Tuple[str, str, str]]:

    global _versions_enum_list
    global VERSION_DIR_MODEL

    # Init model if it did not happen yet.
    if not _version_dir_model_init:
        init_version_dir_model(context)

    # Clear all versions in enum list.
    _versions_enum_list.clear()
    _versions_enum_list.extend(VERSION_DIR_MODEL.items_as_enum_list)

    return _versions_enum_list


def add_version_custom(custom_version: str) -> None:
    global _versions_enum_list
    global VERSION_DIR_MODEL

    VERSION_DIR_MODEL.append_item(custom_version)


def _get_cachefiles(cachedir_path: Path, file_ext: str = ".abc") -> List[Path]:
    if file_ext == ".*":
        return [Path(f) for f in cachedir_path.iterdir() if f.is_file()]
    else:
        return [
            Path(f)
            for f in cachedir_path.iterdir()
            if f.is_file() and f.suffix == file_ext
        ]


def get_cachefiles_enum(
    self: bpy.types.Operator, context: bpy.types.Context
) -> List[Tuple[str, str, str]]:

    _cachefiles_enum_list.clear()

    if not context.scene.cm.is_cachedir_valid:
        return _cachefiles_enum_list

    _cachefiles_enum_list.extend(
        [
            (path.as_posix(), path.name, "")
            for path in _get_cachefiles(context.scene.cm.cachedir_path)
        ]
    )

    return _cachefiles_enum_list


def traverse_collection_tree(
    collection: bpy.types.Collection,
) -> Generator[bpy.types.Collection, None, None]:
    yield collection
    for child in collection.children:
        yield from traverse_collection_tree(child)


def _print_log_list(log_list: Dict[str, List[str]], header_str: str) -> None:
    if log_list:
        log_new_lines(1)
        text = [f"{obj_name}:\n{''.join(log_list[obj_name])}" for obj_name in log_list]
        logger.info("%s\n%s", header_str, "".join(text))


def _append_str_to_log_list(
    log_list: Dict[str, List[str]], obj_name: str, str_value: str
) -> Dict[str, List[str]]:
    log_list.setdefault(obj_name, [])
    log_list[obj_name].append(f"{str_value},\n")
    return log_list
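The `get_version` helper extracts a `vNNN` token with a regex. A standalone sketch of the same logic, with the pattern from `cmglobals` copied locally so the example runs outside Blender:

```python
import re
from typing import Union

# Local copy of cmglobals._VERSION_PATTERN for illustration.
_VERSION_PATTERN = r"v\d\d\d"


def get_version(str_value: str, format: type = str) -> Union[str, int, None]:
    # Search for a 'vNNN' token anywhere in the string.
    match = re.search(_VERSION_PATTERN, str_value)
    if match:
        version = match.group()
        if format == str:
            return version
        if format == int:
            return int(version.replace("v", ""))
    return None


print(get_version("shot_010_v003.abc"))       # → v003
print(get_version("shot_010_v003.abc", int))  # → 3
print(get_version("no_version_here"))         # → None
```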
def disable_vis_drivers(
    objects: List[bpy.types.Object], modifiers: bool = True
) -> List[bpy.types.Driver]:

    # Store drivers that were muted to unmute them afterwards.
    muted_drivers: List[bpy.types.Driver] = []

    # Log list.
    log_list: Dict[str, List[str]] = {}

    for obj in objects:
        if obj.animation_data:
            for driver in obj.animation_data.drivers:

                # Get suffix of the data path; if it drives a modifier, the
                # modifier name is at the beginning.
                data_path_split = driver.data_path.split(".")
                data_path_suffix = data_path_split[-1]

                # If modifiers == False do not adjust drivers whose data
                # paths start with "modifiers".
                if not modifiers:
                    if len(data_path_split) > 1:
                        if data_path_split[0].startswith("modifiers"):
                            continue

                # Only disable drivers that drive visibility data paths.
                if data_path_suffix not in cmglobals.DRIVER_VIS_DATA_PATHS:
                    continue

                # If muted already, continue.
                if driver.mute:
                    continue

                # Mute.
                driver.mute = True
                muted_drivers.append(driver)

                # Populate log list.
                _append_str_to_log_list(log_list, obj.name, driver.data_path)
    # Log.
    _print_log_list(log_list, "Disable visibility drivers:")
    return muted_drivers


def disable_drivers_by_data_path(
    objects: List[bpy.types.Object], data_path: str
) -> List[bpy.types.Driver]:

    # Store drivers that were muted to unmute them afterwards.
    muted_drivers: List[bpy.types.Driver] = []

    # Log list.
    log_list: Dict[str, List[str]] = {}

    for obj in objects:
        if obj.animation_data:
            for driver in obj.animation_data.drivers:

                if driver.data_path != data_path:
                    continue

                # Skip if driver already muted.
                if driver.mute:
                    continue

                # Mute.
                driver.mute = True
                muted_drivers.append(driver)

                # Populate log list.
                _append_str_to_log_list(log_list, obj.name, driver.data_path)

    return muted_drivers


def sync_modifier_vis_with_render_setting(
    objs: List[bpy.types.Object],
) -> List[Tuple[bpy.types.Modifier, bool, bool]]:

    mods_vis_override: List[Tuple[bpy.types.Modifier, bool, bool]] = []
    log_list: Dict[str, List[str]] = {}

    for obj in objs:

        for mod in obj.modifiers:

            # Do not affect modifiers that are kept for export.
            if mod.type in cmglobals.MODIFIERS_KEEP:
                continue

            # If already synced, continue.
            if mod.show_viewport == mod.show_render:
                continue

            # Save state for reconstruction later.
            show_viewport_cache = mod.show_viewport
            show_render_cache = mod.show_render

            # Sync show_viewport with show_render setting.
            mod.show_viewport = mod.show_render
            mods_vis_override.append((mod, show_viewport_cache, show_render_cache))

            # Populate log list.
            _append_str_to_log_list(
                log_list,
                obj.name,
                f"{mod.name}: V: {show_viewport_cache} -> {mod.show_viewport}",
            )
    # Log.
    _print_log_list(log_list, "Sync modifier viewport vis with render vis:")

    return mods_vis_override
def apply_modifier_suffix_vis_override(
    objs: List[bpy.types.Object], category: str
) -> List[Tuple[bpy.types.Modifier, bool, bool]]:

    mods_vis_override: List[Tuple[bpy.types.Modifier, bool, bool]] = []

    log_list: Dict[str, List[str]] = {}

    for obj in objs:

        for mod in list(obj.modifiers):

            show_viewport_cache = mod.show_viewport
            show_render_cache = mod.show_render

            if category == "EXPORT":

                if mod.name.endswith(cmglobals.CACHE_OFF_SUFFIX):
                    if not mod.show_viewport and not mod.show_render:
                        continue
                    mod.show_viewport = False
                    mod.show_render = False

                elif mod.name.endswith(cmglobals.CACHE_ON_SUFFIX):
                    if mod.show_viewport and mod.show_render:
                        continue
                    mod.show_viewport = True
                    mod.show_render = True

                else:
                    continue

            if category == "IMPORT":

                if mod.name.endswith(cmglobals.CACHE_OFF_SUFFIX):
                    if mod.show_viewport and mod.show_render:
                        continue
                    mod.show_viewport = True
                    mod.show_render = True

                elif mod.name.endswith(cmglobals.CACHE_ON_SUFFIX):
                    if not mod.show_viewport and not mod.show_render:
                        continue
                    mod.show_viewport = False
                    mod.show_render = False

                else:
                    continue

            mods_vis_override.append((mod, show_viewport_cache, show_render_cache))

            # Populate log list.
            _append_str_to_log_list(
                log_list,
                obj.name,
                f"{mod.name}: V: {show_viewport_cache} -> {mod.show_viewport} R: {show_render_cache} -> {mod.show_render}",
            )
    # Log.
    _print_log_list(log_list, "Apply modifier suffix vis override:")

    return mods_vis_override


def restore_modifier_vis(
    modifiers: List[Tuple[bpy.types.Modifier, bool, bool]]
) -> None:

    log_list: Dict[str, List[str]] = {}

    for mod, show_viewport, show_render in modifiers:

        if mod.show_viewport == show_viewport and mod.show_render == show_render:
            continue

        show_viewport_cache = mod.show_viewport
        show_render_cache = mod.show_render

        mod.show_viewport = show_viewport
        mod.show_render = show_render

        # Populate log list.
        _append_str_to_log_list(
            log_list,
            mod.id_data.name,
            f"{mod.name}: V: {show_viewport_cache} -> {mod.show_viewport} R: {show_render_cache} -> {mod.show_render}",
        )

    # Log.
    _print_log_list(log_list, "Restore modifier visibility:")


def config_modifiers_keep_state(
    objs: List[bpy.types.Object],
    enable: bool = True,
) -> List[Tuple[bpy.types.Modifier, bool, bool]]:

    mods_vis_override: List[Tuple[bpy.types.Modifier, bool, bool]] = []

    noun = "Enabled" if enable else "Disabled"

    log_list: Dict[str, List[str]] = {}

    for obj in objs:

        for mod in list(obj.modifiers):

            if mod.type not in cmglobals.MODIFIERS_KEEP:
                continue

            show_viewport_cache = mod.show_viewport
            show_render_cache = mod.show_render

            if enable:
                if mod.show_viewport and mod.show_render:
                    continue
                # Do not change the viewport setting on enable; that might create
                # overhead for modifiers that are only needed for render.
                mod.show_render = True

            else:
                if not mod.show_viewport and not mod.show_render:
                    continue
                mod.show_viewport = False
                mod.show_render = False

            mods_vis_override.append((mod, show_viewport_cache, show_render_cache))

            # Populate log list.
            _append_str_to_log_list(
                log_list,
                obj.name,
                mod.name,
            )
    # Log.
    _print_log_list(log_list, f"{noun} modifiers:")

    return mods_vis_override
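The functions above share one pattern: snapshot each modifier's visibility as a `(mod, show_viewport, show_render)` tuple before overriding it, then restore from those tuples afterwards. A minimal sketch of that save/override/restore pattern, using a plain stand-in class instead of `bpy.types.Modifier` so it runs outside Blender:

```python
from typing import List, Tuple


class FakeMod:
    # Stand-in for bpy.types.Modifier; only the two visibility flags matter here.
    def __init__(self, name: str, show_viewport: bool, show_render: bool):
        self.name = name
        self.show_viewport = show_viewport
        self.show_render = show_render


def sync_with_render(mods: List[FakeMod]) -> List[Tuple[FakeMod, bool, bool]]:
    override: List[Tuple[FakeMod, bool, bool]] = []
    for mod in mods:
        if mod.show_viewport == mod.show_render:
            continue  # already in sync, nothing to restore later
        # Snapshot the original state before overriding.
        override.append((mod, mod.show_viewport, mod.show_render))
        mod.show_viewport = mod.show_render
    return override


def restore(override: List[Tuple[FakeMod, bool, bool]]) -> None:
    for mod, show_viewport, show_render in override:
        mod.show_viewport = show_viewport
        mod.show_render = show_render


mods = [FakeMod("subsurf", True, False), FakeMod("bevel", True, True)]
cache = sync_with_render(mods)
print(mods[0].show_viewport)  # → False (synced to its render setting)
restore(cache)
print(mods[0].show_viewport)  # → True (original state restored)
```

Returning only the tuples that were actually changed keeps the restore step a no-op for modifiers that were already in the target state.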
def set_item_vis(
    items: List[Union[bpy.types.Object, bpy.types.Collection]],
    show: bool,
) -> List[Tuple[Union[bpy.types.Object, bpy.types.Collection], bool, bool]]:

    items_vis: List[
        Tuple[Union[bpy.types.Object, bpy.types.Collection], bool, bool]
    ] = []
    hide = not show
    noun = "Hide" if hide else "Show"

    for item in items:

        hide_viewport_cache = item.hide_viewport
        hide_render_cache = item.hide_render

        if hide:
            if item.hide_viewport and item.hide_render:
                continue

            item.hide_render = True
            item.hide_viewport = True

        else:
            if not item.hide_viewport and not item.hide_render:
                continue

            item.hide_render = False
            item.hide_viewport = False

        items_vis.append((item, hide_viewport_cache, hide_render_cache))

    if items_vis:
        # Log.
        logger.info(
            "%s:\n%s",
            noun,
            ",\n".join([item.name for item, hv, hr in items_vis]),
        )

    return items_vis


def restore_item_vis(
    item_vis_list: List[
        Tuple[Union[bpy.types.Object, bpy.types.Collection], bool, bool]
    ]
) -> None:

    log_list: Dict[str, List[str]] = {}

    for item, hide_viewport, hide_render in item_vis_list:

        if item.hide_viewport == hide_viewport and item.hide_render == hide_render:
            continue

        hide_viewport_cache = item.hide_viewport
        hide_render_cache = item.hide_render

        item.hide_viewport = hide_viewport
        item.hide_render = hide_render

        # Populate log list.
        _append_str_to_log_list(
            log_list,
            item.name,
            f"V: {not hide_viewport_cache} -> {not item.hide_viewport} R: {not hide_render_cache} -> {not item.hide_render}",
        )

    # Log.
    _print_log_list(log_list, "Restore visibility:")


def get_layer_colls_from_colls(
    context: bpy.types.Context, collections: List[bpy.types.Collection]
) -> List[bpy.types.LayerCollection]:

    layer_colls: List[bpy.types.LayerCollection] = []
    coll_names: List[str] = [coll.name for coll in collections]

    for lcoll in list(traverse_collection_tree(context.view_layer.layer_collection)):
        if lcoll.name in coll_names:
            layer_colls.append(lcoll)

    return layer_colls


def set_layer_coll_exlcude(
    layer_collections: List[bpy.types.LayerCollection], exclude: bool
) -> List[Tuple[bpy.types.LayerCollection, bool]]:

    layer_colls_vis: List[Tuple[bpy.types.LayerCollection, bool]] = []

    noun = "Exclude" if exclude else "Include"

    for lcoll in layer_collections:

        exclude_cache = lcoll.exclude

        if exclude:
            if lcoll.exclude and lcoll.hide_render:
                continue

            lcoll.exclude = True

        else:
            if not lcoll.exclude:
                continue

            lcoll.exclude = False

        layer_colls_vis.append((lcoll, exclude_cache))

    if layer_colls_vis:
        # Log.
        logger.info(
            "%s layer collections in current view layer:\n%s",
            noun,
            ",\n".join([lcoll.name for lcoll, ex in layer_colls_vis]),
        )

    return layer_colls_vis


def restore_layer_coll_exlude(
    lcoll_vis_list: List[Tuple[bpy.types.LayerCollection, bool]]
) -> None:

    log_list: Dict[str, List[str]] = {}

    for lcoll, exclude in lcoll_vis_list:

        if lcoll.exclude == exclude:
            continue

        exclude_cache = lcoll.exclude

        lcoll.exclude = exclude

        # Populate log list.
        _append_str_to_log_list(
            log_list,
            lcoll.name,
            f"exclude: {exclude_cache} -> {lcoll.exclude}",
        )

    # Log.
    _print_log_list(log_list, "Restore layer collection visibility:")
def enable_muted_drivers(
    muted_drivers: List[bpy.types.Driver],
) -> List[bpy.types.Driver]:

    # Log list.
    log_list: Dict[str, List[str]] = {}

    for driver in muted_drivers:

        if not driver.mute:
            continue

        driver.mute = False

        # Populate log list.
        _append_str_to_log_list(log_list, driver.id_data.name, driver.data_path)

    # Log.
    _print_log_list(log_list, "Enable drivers:")

    return muted_drivers


def gen_abc_object_path(obj: bpy.types.Object) -> str:
    # If an object is duplicated (multiple copies of the same object that get
    # different caches) we have to remove the .001 suffix that gets created
    # automatically on duplication, otherwise the object path is not valid.

    object_name = obj.name
    object_path = "/" + object_name

    if obj.data and obj.type != "LATTICE":
        object_data_name = obj.data.name
        object_path = "/" + object_name + "/" + object_data_name

    # Dots and whitespace are not valid in the abc tree; replace them with underscores.
    replace = [" ", "."]
    for char in replace:
        object_path = object_path.replace(char, "_")

    return str(object_path)


def disable_non_keep_modifiers(obj: bpy.types.Object) -> int:
    modifiers = list(obj.modifiers)
    a_index: int = -1
    disabled_mods = []
    for idx, mod in enumerate(modifiers):
        if mod.type not in cmglobals.MODIFIERS_KEEP:
            # Save index of the first armature modifier.
            if a_index == -1 and mod.type == "ARMATURE":
                a_index = idx

            if not mod.show_viewport and not mod.show_render:
                continue

            mod.show_viewport = False
            mod.show_render = False
            mod.show_in_editmode = False
            disabled_mods.append(mod.name)

    if disabled_mods:
        logger.info("%s Disabled modifiers: %s", obj.name, ", ".join(disabled_mods))

    return a_index


def disable_non_keep_constraints(obj: bpy.types.Object) -> List[bpy.types.Constraint]:
    constraints = list(obj.constraints)
    disabled_const: List[bpy.types.Constraint] = []

    for c in constraints:
        if c.type in cmglobals.CONSTRAINTS_KEEP:
            continue

        if c.mute:
            continue

        c.mute = True
        disabled_const.append(c)

    if disabled_const:
        logger.info(
            "%s Disabled constraints: %s",
            obj.name,
            ", ".join([c.name for c in disabled_const]),
        )
    return disabled_const


def ensure_cachefile(cachefile_path: str) -> bpy.types.CacheFile:
    # Get cachefile name for this path.
    cachefile_name = Path(cachefile_path).name

    # Import the Alembic cache. If it is already imported, reload it.
    try:
        bpy.data.cache_files[cachefile_name]
    except KeyError:
        bpy.ops.cachefile.open(filepath=cachefile_path)
        logger.info("Imported cachefile: %s", cachefile_path)
    else:
        bpy.ops.cachefile.reload()

    cachefile = bpy.data.cache_files[cachefile_name]
    cachefile.scale = 1
    return cachefile


def ensure_cache_modifier(obj: bpy.types.Object) -> bpy.types.MeshSequenceCacheModifier:
    modifier_name = cmglobals.MODIFIER_NAME

    # If the modifier does not exist yet, create it.
    if obj.modifiers.find(modifier_name) == -1:  # not found
        mod = obj.modifiers.new(modifier_name, "MESH_SEQUENCE_CACHE")
        logger.info(
            "%s added %s modifier.",
            obj.name,
            modifier_name,
        )
    mod = obj.modifiers.get(modifier_name)
    return mod


def ensure_cache_constraint(
    obj: bpy.types.Object,
) -> bpy.types.TransformCacheConstraint:
    constraint_name = cmglobals.CONSTRAINT_NAME
    # If the constraint does not exist yet, create it.
    if obj.constraints.find(constraint_name) == -1:  # not found
        con = obj.constraints.new("TRANSFORM_CACHE")
        con.name = constraint_name
        logger.info(
            "%s added %s constraint.",
            obj.name,
            constraint_name,
        )
    con = obj.constraints.get(constraint_name)
    return con


def kill_increment(str_value: str) -> str:
    match = re.search(r"\.\d\d\d", str_value)
    if match:
        return str_value.replace(match.group(0), "")
    return str_value
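`kill_increment` and `gen_abc_object_path` both normalize Blender datablock names for the Alembic object tree. A standalone sketch of that string handling (the bodies are reproduced or inlined here since the module itself imports `bpy`; the sample names are illustrative):

```python
import re


def kill_increment(str_value: str) -> str:
    # Strip a Blender ".001"-style duplicate suffix from a datablock name.
    match = re.search(r"\.\d\d\d", str_value)
    if match:
        return str_value.replace(match.group(0), "")
    return str_value


def sanitize_abc_path(object_path: str) -> str:
    # The same character cleanup gen_abc_object_path applies: dots and
    # whitespace are not valid in an Alembic object path.
    for char in (" ", "."):
        object_path = object_path.replace(char, "_")
    return object_path


print(kill_increment("GEO-char.001"))                # → GEO-char
print(sanitize_abc_path("/GEO char/GEO char.mesh"))  # → /GEO_char/GEO_char_mesh
```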
|
||||
|
||||
|
||||
def config_cache_modifier(
    context: bpy.types.Context,
    mod: bpy.types.MeshSequenceCacheModifier,
    modifier_index: int,
    cachefile: bpy.types.CacheFile,
    abc_obj_path: str,
) -> bpy.types.MeshSequenceCacheModifier:
    obj = mod.id_data
    # Move modifier to index.
    # As we need to use bpy.ops for that, the object needs to be active.
    bpy.context.view_layer.objects.active = obj
    override = context.copy()
    override["modifier"] = mod
    bpy.ops.object.modifier_move_to_index(
        override, modifier=mod.name, index=modifier_index
    )
    # Adjust settings.
    mod.cache_file = cachefile
    mod.object_path = abc_obj_path

    return mod


def config_cache_constraint(
    context: bpy.types.Context,
    con: bpy.types.TransformCacheConstraint,
    cachefile: bpy.types.CacheFile,
    abc_obj_path: str,
) -> bpy.types.TransformCacheConstraint:
    obj = con.id_data

    # Move constraint to the top of the stack.
    current_index = obj.constraints.find(con.name)
    obj.constraints.move(current_index, 0)

    # Adjust settings.
    con.cache_file = cachefile
    con.object_path = abc_obj_path

    return con


def add_coll_to_cache_collections(
    context: bpy.types.Context, coll: bpy.types.Collection, category: str
) -> Optional[bpy.types.Collection]:

    scn = context.scene

    scn_category = scn.cm.colls_export
    idx = scn.cm.colls_export_index

    if category == "IMPORT":
        scn_category = scn.cm.colls_import
        idx = scn.cm.colls_import_index

    if coll in [c[1].coll_ptr for c in scn_category.items()]:
        logger.info(
            "%s already in the %s cache collections list", coll.name, category.lower()
        )
        # Set is_cache_coll.
        coll.cm.is_cache_coll = True

        return None
    else:
        if category == "EXPORT" and not coll.override_library and not coll.library:
            # Local collection:
            # the blend file needs to be saved for that.
            if not bpy.data.filepath:
                logger.error(
                    "Failed to add local collection %s to export list. Blend file needs to be saved.",
                    coll.name,
                )
                return None

        item = scn_category.add()
        item.coll_ptr = coll
        item.name = item.coll_ptr.name
        idx = len(scn_category) - 1

        # Set is_cache_coll.
        coll.cm.is_cache_coll = True

        logger.info(
            "%s added to %s cache collections list", item.name, category.lower()
        )

        return coll


def rm_coll_from_cache_collections(
    context: bpy.types.Context, category: str
) -> Optional[bpy.types.Collection]:

    scn = context.scene

    scn_category = scn.cm.colls_export
    idx = scn.cm.colls_export_index

    if category == "IMPORT":
        scn_category = scn.cm.colls_import
        idx = scn.cm.colls_import_index

    try:
        item = scn_category[idx]
    except IndexError:
        return None
    else:
        # Read the item's data before removing it; the PropertyGroup
        # becomes invalid once it is removed from the collection.
        item_name = item.name
        coll = item.coll_ptr
        scn_category.remove(idx)
        idx -= 1

        # Reset coll.cm properties.
        if coll:  # Check if not None (coll might be deleted).
            coll.cm.reset_properties()

        logger.info(
            "Removed %s from %s cache collections list", item_name, category.lower()
        )
        return coll


def get_cache_frame_range(context: bpy.types.Context) -> Tuple[int, int]:
    frame_in = context.scene.frame_start - context.scene.cm.frame_handles_left
    if frame_in < 0:
        frame_in = 0
    frame_end = context.scene.frame_end + context.scene.cm.frame_handles_right

    return (frame_in, frame_end)

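`get_cache_frame_range` widens the scene frame range by the configured handles and clamps the start at zero. A bpy-free sketch of the same arithmetic (function and parameter names are illustrative):

```python
from typing import Tuple


def cache_frame_range(
    frame_start: int, frame_end: int, handles_left: int, handles_right: int
) -> Tuple[int, int]:
    # Widen the scene range by the handles; the cache start never goes below frame 0.
    frame_in = max(frame_start - handles_left, 0)
    frame_out = frame_end + handles_right
    return (frame_in, frame_out)


print(cache_frame_range(101, 200, 10, 10))  # (91, 210)
print(cache_frame_range(5, 50, 10, 10))     # (0, 60)
```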
def set_instancing_type_of_empties(
    object_list: List[bpy.types.Object], instance_type: str
) -> List[Tuple[bpy.types.Object, str]]:
    if instance_type not in cmglobals.INSTANCE_TYPES:
        raise ValueError(f"Invalid instance type: {instance_type}")

    empties_to_restore: List[Tuple[bpy.types.Object, str]] = []

    for obj in object_list:
        if not obj.type == "EMPTY":
            continue

        if obj.instance_type == instance_type:
            continue

        instance_type_cache = obj.instance_type

        obj.instance_type = instance_type

        empties_to_restore.append((obj, instance_type_cache))

    if empties_to_restore:
        logger.info(
            "Set instance type to %s for empties: %s\n",
            instance_type,
            ", ".join([obj.name for obj, it in empties_to_restore]),
        )

    return empties_to_restore


def restore_instancing_type(restore_list: List[Tuple[bpy.types.Object, str]]) -> None:

    # Log list.
    log_list: Dict[str, List[str]] = {}

    for obj, instance_type in restore_list:

        if obj.instance_type == instance_type:
            continue

        instance_type_cache = obj.instance_type
        obj.instance_type = instance_type

        # Populate log list.
        _append_str_to_log_list(
            log_list,
            obj.name,
            f"{instance_type_cache}: -> {obj.instance_type}",
        )

    # Log.
    _print_log_list(log_list, "Restore instance types:")


def is_item_local(
    item: Union[bpy.types.Collection, bpy.types.Object, bpy.types.Camera]
) -> bool:
    # Local data-block of the blend file.
    if not item.override_library and not item.library:
        return True
    return False


def is_item_lib_override(
    item: Union[bpy.types.Collection, bpy.types.Object, bpy.types.Camera]
) -> bool:
    # Data-block from a libfile that is overridden.
    if item.override_library and not item.library:
        return True
    return False


def is_item_lib_source(
    item: Union[bpy.types.Collection, bpy.types.Object, bpy.types.Camera]
) -> bool:
    # Source data-block from a libfile, not overridden.
    if not item.override_library and item.library:
        return True
    return False


def get_item_libfile(
    item: Union[bpy.types.Collection, bpy.types.Object, bpy.types.Camera]
) -> str:
    if is_item_lib_source(item):
        # Source data-block, not overridden.
        lib = item.library
        return Path(os.path.abspath(bpy.path.abspath(lib.filepath))).as_posix()

    if is_item_local(item):
        # Local data-block:
        # the blend file needs to be saved for that.
        if not bpy.data.filepath:
            return ""
        return Path(os.path.abspath(bpy.path.abspath(bpy.data.filepath))).as_posix()

    if is_item_lib_override(item):
        # Overridden data-block.
        lib = item.override_library.reference.library
        return Path(os.path.abspath(bpy.path.abspath(lib.filepath))).as_posix()

    return ""

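Every branch of `get_item_libfile` funnels through the same `Path(os.path.abspath(bpy.path.abspath(...))).as_posix()` chain so that the same library file always yields the same string, regardless of how its path was written. A bpy-free sketch of that normalization, using `os.path.normpath` as a stand-in for the absolute-path step (the add-on additionally resolves Blender's `//` prefix via `bpy.path.abspath`):

```python
import os
from pathlib import Path


def normalize_libpath(filepath: str) -> str:
    # Collapse ".." segments and unify separators so equal paths compare equal.
    return Path(os.path.normpath(filepath)).as_posix()


print(normalize_libpath("/project/shots/../lib/char.blend"))  # /project/lib/char.blend
```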
def set_simplify(use_simplify: bool) -> None:
    if bpy.context.scene.render.use_simplify == use_simplify:
        return

    verb = "Enabled"
    if not use_simplify:
        verb = "Disabled"
    bpy.context.scene.render.use_simplify = use_simplify
    logger.info("%s simplify", verb)
@@ -0,0 +1,108 @@
# SPDX-FileCopyrightText: 2021 Blender Studio Tools Authors
#
# SPDX-License-Identifier: GPL-3.0-or-later

import os
from pathlib import Path
from typing import Optional, Any

import bpy
from bpy.app.handlers import persistent

from . import opsdata


def update_cache_version_property(context: bpy.types.Context) -> None:
    items = opsdata.VERSION_DIR_MODEL.items
    if not items:
        context.scene.cm.cache_version = ""
    else:
        context.scene.cm.cache_version = items[0]


def category_update_version_model(self: Any, context: bpy.types.Context) -> None:
    opsdata.init_version_dir_model(context)
    update_cache_version_property(context)


def addon_prefs_get(
    context: Optional[bpy.types.Context] = None,
) -> bpy.types.AddonPreferences:
    """Shortcut to get the cache_manager addon preferences."""
    if not context:
        context = bpy.context
    from . import __package__ as base_package

    if base_package.startswith("bl_ext"):
        # Blender 4.2+ extension: the full package name is the key.
        return context.preferences.addons[base_package].preferences
    else:
        # Legacy add-on: the top-level package name is the key.
        return context.preferences.addons[base_package.split(".")[0]].preferences


class CM_AddonPreferences(bpy.types.AddonPreferences):
    bl_idname = __package__

    cachedir_root: bpy.props.StringProperty(  # type: ignore
        name="cache dir",
        default="//cache",
        options={"HIDDEN", "SKIP_SAVE"},
        subtype="DIR_PATH",
        description="Root directory in which the caches will be exported. Will create subfolders during export",
        update=category_update_version_model,
    )

    def draw(self, context: bpy.types.Context) -> None:
        layout = self.layout
        box = layout.box()
        box.row().prop(self, "cachedir_root", text="Root Cache Directory")

        if not self.cachedir_root:
            row = box.row()
            row.label(text="Please specify the root cache directory.", icon="ERROR")

        if not bpy.data.filepath and self.cachedir_root.startswith("//"):
            row = box.row()
            row.label(
                text="In order to use a relative path as root cache directory the current file needs to be saved.",
                icon="ERROR",
            )

    @property
    def cachedir_root_path(self) -> Optional[Path]:
        if not self.is_cachedir_root_valid:
            return None
        return Path(os.path.abspath(bpy.path.abspath(self.cachedir_root)))

    @property
    def is_cachedir_root_valid(self) -> bool:
        # Check if a root is set at all.
        if not self.cachedir_root:
            return False

        # A relative root ("//") requires the blend file to be saved.
        if not bpy.data.filepath and self.cachedir_root.startswith("//"):
            return False

        return True


# ---------REGISTER ----------.


@persistent
def load_post_handler_init_model_cache_version(dummy: Any) -> None:
    category_update_version_model(None, bpy.context)


classes = [CM_AddonPreferences]


def register():
    bpy.app.handlers.load_post.append(load_post_handler_init_model_cache_version)
    for cls in classes:
        bpy.utils.register_class(cls)


def unregister():
    bpy.app.handlers.load_post.remove(load_post_handler_init_model_cache_version)
    for cls in reversed(classes):
        bpy.utils.unregister_class(cls)
@@ -0,0 +1,256 @@
# SPDX-FileCopyrightText: 2021 Blender Studio Tools Authors
#
# SPDX-License-Identifier: GPL-3.0-or-later

import os

from typing import List, Any, Generator, Optional
from pathlib import Path

import bpy

from . import prefs, propsdata


class CM_collection_property(bpy.types.PropertyGroup):
    coll_ptr: bpy.props.PointerProperty(name="Collection", type=bpy.types.Collection)


class CM_property_group_collection(bpy.types.PropertyGroup):

    is_cache_coll: bpy.props.BoolProperty(
        name="Cache Collection",
        default=False,
        options={"LIBRARY_EDITABLE"},
        override={"LIBRARY_OVERRIDABLE"},
    )

    cachefile: bpy.props.StringProperty(
        name="Cachefile",
        default="",
        subtype="FILE_PATH",
        options={"LIBRARY_EDITABLE"},
        override={"LIBRARY_OVERRIDABLE"},
    )
    is_cache_loaded: bpy.props.BoolProperty(
        name="Cache Loaded",
        default=False,
        options={"LIBRARY_EDITABLE"},
        override={"LIBRARY_OVERRIDABLE"},
    )
    is_cache_hidden: bpy.props.BoolProperty(
        name="Cache Hidden",
        default=False,
        options={"LIBRARY_EDITABLE"},
        override={"LIBRARY_OVERRIDABLE"},
    )

    def reset_properties(self):
        self.is_cache_coll = False
        self.cachefile = ""
        self.is_cache_loaded = False
        self.is_cache_hidden = False


class CM_property_group_scene(bpy.types.PropertyGroup):

    category: bpy.props.EnumProperty(  # type: ignore
        items=(
            ("EXPORT", "Export", "Export Cache Collections", "EXPORT", 0),
            ("IMPORT", "Import", "Import Cache Collections", "IMPORT", 1),
        ),
        default="EXPORT",
        update=prefs.category_update_version_model,
    )

    colls_export_index: bpy.props.IntProperty(name="Index", default=0)

    colls_import_index: bpy.props.IntProperty(name="Index", default=0)

    cache_version: bpy.props.StringProperty(name="Version", default="v001")

    colls_export: bpy.props.CollectionProperty(type=CM_collection_property)

    colls_import: bpy.props.CollectionProperty(type=CM_collection_property)

    cacheconfig: bpy.props.StringProperty(
        name="Cacheconfig File", get=propsdata.gen_cacheconfig_path_str
    )

    cachedir: bpy.props.StringProperty(
        name="Cachedir",
        get=propsdata.gen_cachedir_path_str,
        description="Directory in which the cachefiles are located",
    )
    cache_version_dir: bpy.props.StringProperty(
        name="Cache version dir", get=propsdata.get_cache_version_dir_path_str
    )

    use_cacheconfig_custom: bpy.props.BoolProperty(
        name="Custom Cacheconfig", default=False
    )
    cacheconfig_custom: bpy.props.StringProperty(
        name="Cacheconfig File",
        default="",
        subtype="FILE_PATH",
    )

    xsamples: bpy.props.IntProperty(
        name="Transform Samples",
        description="Sets the xsamples argument of bpy.ops.wm.alembic_export to the specified value",
        default=1,
        min=1,
        max=128,
        step=1,
    )
    gsamples: bpy.props.IntProperty(
        name="Geometry Samples",
        description="Sets the gsamples argument of bpy.ops.wm.alembic_export to the specified value",
        default=1,
        min=1,
        max=128,
        step=1,
    )
    sh_open: bpy.props.FloatProperty(
        name="Shutter Open",
        description="Sets the sh_open argument of bpy.ops.wm.alembic_export to the specified value",
        default=0,
        min=-1.0,
        max=1.0,
        step=0.1,
    )
    sh_close: bpy.props.FloatProperty(
        name="Shutter Close",
        description="Sets the sh_close argument of bpy.ops.wm.alembic_export to the specified value",
        default=1,
        min=-1.0,
        max=1.0,
        step=0.1,
    )

    frame_handles_left: bpy.props.IntProperty(
        name="Frame Handles Start",
        description="Caching starts at the frame in of the scene minus the specified amount of frame handles",
        default=10,
        min=0,
        step=1,
    )
    frame_handles_right: bpy.props.IntProperty(
        name="Frame Handles End",
        description="Caching stops at the frame out of the scene plus the specified amount of frame handles",
        default=10,
        min=0,
        step=1,
    )

    @property
    def cachedir_path(self) -> Optional[Path]:
        if not self.is_cachedir_valid:
            return None

        return Path(os.path.abspath(bpy.path.abspath(self.cachedir)))

    @property
    def is_cachedir_valid(self) -> bool:
        if not self.cachedir:
            return False

        if not bpy.data.filepath and self.cachedir.startswith("//"):
            return False

        return True

    @property
    def cache_version_dir_path(self) -> Optional[Path]:
        if not self.is_cache_version_dir_valid:
            return None

        return Path(os.path.abspath(bpy.path.abspath(self.cache_version_dir)))

    @property
    def is_cache_version_dir_valid(self) -> bool:
        if not self.cache_version_dir:
            return False

        if not bpy.data.filepath and self.cache_version_dir.startswith("//"):
            return False

        return True

    @property
    def is_cacheconfig_valid(self) -> bool:
        if not self.cacheconfig:
            return False

        if not bpy.data.filepath and self.cacheconfig.startswith("//"):
            return False

        return True

    @property
    def is_cacheconfig_custom_valid(self) -> bool:
        if not self.cacheconfig_custom:
            return False

        if not bpy.data.filepath and self.cacheconfig_custom.startswith("//"):
            return False

        return True

    @property
    def cacheconfig_path(self) -> Optional[Path]:
        if not self.is_cacheconfig_valid:
            return None
        return Path(os.path.abspath(bpy.path.abspath(self.cacheconfig)))

    @property
    def cacheconfig_custom_path(self) -> Optional[Path]:
        if not self.is_cacheconfig_custom_valid:
            return None
        return Path(os.path.abspath(bpy.path.abspath(self.cacheconfig_custom)))

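Each of the `is_*_valid` properties above repeats the same rule: a path is usable unless it is empty, or it is Blender-relative (`//`) while the blend file has never been saved. A standalone sketch of that rule (the function name is illustrative):

```python
def is_cache_path_valid(path: str, blend_filepath: str) -> bool:
    # Empty paths are never valid.
    if not path:
        return False
    # A Blender-relative path ("//...") can only be resolved once the file is saved.
    if not blend_filepath and path.startswith("//"):
        return False
    return True


print(is_cache_path_valid("//cache", ""))                 # False: unsaved file
print(is_cache_path_valid("//cache", "/prj/shot.blend"))  # True
print(is_cache_path_valid("/tmp/cache", ""))              # True: absolute path
```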
def get_cache_collections_import(
    context: bpy.types.Context,
) -> Generator[bpy.types.Collection, None, None]:
    for item in context.scene.cm.colls_import:
        if item.coll_ptr:
            yield item.coll_ptr


def get_cache_collections_export(
    context: bpy.types.Context,
) -> Generator[bpy.types.Collection, None, None]:
    for item in context.scene.cm.colls_export:
        if item.coll_ptr:
            yield item.coll_ptr


# ---------REGISTER ----------.

classes: List[Any] = [
    CM_collection_property,
    CM_property_group_collection,
    CM_property_group_scene,
]


def register():

    for cls in classes:
        bpy.utils.register_class(cls)

    # Scene Properties.
    bpy.types.Scene.cm = bpy.props.PointerProperty(type=CM_property_group_scene)

    # Collection Properties.
    bpy.types.Collection.cm = bpy.props.PointerProperty(
        name="Cache Manager",
        type=CM_property_group_collection,
        description="Metadata that is required for the cache manager",
        override={"LIBRARY_OVERRIDABLE"},
    )


def unregister():
    for cls in reversed(classes):
        bpy.utils.unregister_class(cls)
@@ -0,0 +1,133 @@
# SPDX-FileCopyrightText: 2021 Blender Studio Tools Authors
#
# SPDX-License-Identifier: GPL-3.0-or-later

import os
from pathlib import Path
from typing import Any

import bpy

from .prefs import addon_prefs_get
from .logger import LoggerFactory

logger = LoggerFactory.getLogger(__name__)


def ui_redraw() -> None:
    """Forces Blender to redraw the UI."""
    for screen in bpy.data.screens:
        for area in screen.areas:
            area.tag_redraw()


def _get_scene_name() -> str:
    if not bpy.data.filepath:
        return ""

    filepath = Path(os.path.abspath(bpy.path.abspath(bpy.data.filepath)))
    return filepath.parents[1].name


def _get_shot_name() -> str:
    if not bpy.data.filepath:
        return ""

    filepath = Path(os.path.abspath(bpy.path.abspath(bpy.data.filepath)))
    return filepath.parents[0].name


def _gen_cacheconfig_filename() -> str:
    return f"{_get_shot_name()}.cacheconfig.{bpy.context.scene.cm.cache_version}.json"


def gen_cachedir_path_str(self: Any) -> str:
    addon_prefs = addon_prefs_get()

    if not addon_prefs.is_cachedir_root_valid:
        return ""

    p = (
        Path(addon_prefs.cachedir_root_path)
        / _get_scene_name()
        / _get_shot_name()
        / bpy.context.scene.cm.cache_version
    )

    return p.absolute().as_posix()


def gen_cacheconfig_path_str(self: Any) -> str:

    cachedir_str = gen_cachedir_path_str(None)

    if not cachedir_str:
        return ""

    p = Path(cachedir_str) / _gen_cacheconfig_filename()

    return p.absolute().as_posix()


def gen_cache_coll_filename(collection: bpy.types.Collection) -> str:
    return (
        f"{_get_shot_name()}.{collection.name}.{bpy.context.scene.cm.cache_version}.abc"
    )

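The generators above encode the naming scheme `<shot>.<collection>.<version>.abc` for Alembic caches and `<shot>.cacheconfig.<version>.json` for the config file. A sketch with explicit arguments in place of the `bpy` lookups (argument values are made-up examples):

```python
def cache_coll_filename(shot: str, collection: str, version: str) -> str:
    # Mirrors gen_cache_coll_filename: <shot>.<collection>.<version>.abc
    return f"{shot}.{collection}.{version}.abc"


def cacheconfig_filename(shot: str, version: str) -> str:
    # Mirrors _gen_cacheconfig_filename: <shot>.cacheconfig.<version>.json
    return f"{shot}.cacheconfig.{version}.json"


print(cache_coll_filename("010_0010_A", "CH-rex", "v001"))  # 010_0010_A.CH-rex.v001.abc
print(cacheconfig_filename("010_0010_A", "v001"))           # 010_0010_A.cacheconfig.v001.json
```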
def gen_cachepath_collection(
    collection: bpy.types.Collection, context: bpy.types.Context
) -> Path:
    cachedir_str = gen_cachedir_path_str(None)

    # Validate the string itself: Path("") is truthy (it equals Path(".")).
    if not cachedir_str:
        raise ValueError(
            f"Failed to generate cachepath for collection: {collection.name}. Invalid cachepath: {cachedir_str}"
        )
    cachedir_path = Path(cachedir_str)
    return cachedir_path.joinpath(gen_cache_coll_filename(collection)).absolute()


def get_cache_version_dir_path_str(self: Any) -> str:
    addon_prefs = addon_prefs_get()

    if not addon_prefs.is_cachedir_root_valid:
        return ""

    p = Path(addon_prefs.cachedir_root_path) / _get_scene_name() / _get_shot_name()

    return p.absolute().as_posix()


def rm_deleted_colls_from_list(context: bpy.types.Context) -> None:

    # Derive the category name from the list being processed, not from the
    # currently selected UI category.
    for category_name, category in [
        ("EXPORT", context.scene.cm.colls_export),
        ("IMPORT", context.scene.cm.colls_import),
    ]:

        colls = [item.coll_ptr for item in category]
        colls.reverse()

        # Make sure to remove items from the back to not throw off the subsequent indices.
        for idx, coll in enumerate(colls):
            org_idx = len(colls) - 1 - idx
            if not coll:
                # Remove the item of that category at that index.
                category.remove(org_idx)
                logger.info(
                    "Removed index %i from %s list. Does not exist anymore.",
                    org_idx,
                    category_name.lower(),
                )
                # Update selection index.
                curr_index = context.scene.cm.colls_export_index
                if category_name == "IMPORT":
                    curr_index = context.scene.cm.colls_import_index

                if curr_index == org_idx and curr_index > 0:

                    if category_name == "IMPORT":
                        context.scene.cm.colls_import_index = curr_index - 1
                    else:
                        context.scene.cm.colls_export_index = curr_index - 1
    ui_redraw()

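`rm_deleted_colls_from_list` walks each list from the back so that removing an entry never shifts the indices of entries still to be visited. The same pattern on a plain Python list (a stand-in for the bpy collection property):

```python
def drop_none_entries(items: list) -> list:
    # Iterate indices from the back: removing item i never shifts items < i.
    for idx in range(len(items) - 1, -1, -1):
        if items[idx] is None:
            items.pop(idx)
    return items


print(drop_none_entries(["a", None, "b", None, "c"]))  # ['a', 'b', 'c']
```

Iterating forward while removing would skip the element after each removal, which is exactly the bug the reverse walk avoids.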
@@ -0,0 +1,402 @@
# SPDX-FileCopyrightText: 2021 Blender Studio Tools Authors
#
# SPDX-License-Identifier: GPL-3.0-or-later

from pathlib import Path

import bpy

from . import (
    propsdata,
    props,
    cache,
)
from .ops import (
    CM_OT_cache_export,
    CM_OT_cacheconfig_export,
    CM_OT_import_cache,
    CM_OT_import_colls_from_config,
    CM_OT_update_cache_colls_list,
    CM_OT_cache_list_actions,
    CM_OT_assign_cachefile,
    CM_OT_cache_show,
    CM_OT_cache_hide,
    CM_OT_cache_remove,
    CM_OT_set_cache_version,
    CM_OT_add_cache_version_increment,
)


class CM_PT_vi3d_cache(bpy.types.Panel):
    bl_category = "CacheManager"
    bl_label = "Cache"
    bl_space_type = "VIEW_3D"
    bl_region_type = "UI"
    bl_order = 10

    def draw(self, context: bpy.types.Context) -> None:
        layout = self.layout
        split_factor = 0.225
        split_factor_small = 0.95
        # Category to choose between export / import.
        row = layout.row(align=True)
        row.prop(context.scene.cm, "category", expand=True)

        # Add some space.
        row = layout.row(align=True)
        row.separator()

        # Box for cache version and cacheconfig.
        box = layout.box()

        # Version.
        version_text = self._get_version_text(context)

        split = box.split(factor=split_factor, align=True)

        # Version label.
        split.label(text="Version:")

        if context.scene.cm.category == "EXPORT":
            sub_split = split.split(factor=split_factor_small, align=True)
            sub_split.operator(
                CM_OT_set_cache_version.bl_idname,
                icon="DOWNARROW_HLT",
                text=version_text,
            )
            sub_split.operator(
                CM_OT_add_cache_version_increment.bl_idname,
                icon="ADD",
                text="",
            )

        else:
            split.operator(
                CM_OT_set_cache_version.bl_idname,
                icon="DOWNARROW_HLT",
                text=version_text,
            )

        # Cachedir.
        split = box.split(factor=split_factor, align=True)

        # Cachedir label.
        split.label(text="Cache Directory:")

        if not context.scene.cm.is_cachedir_valid:
            split.label(text="Invalid. Check Addon Preferences.")

        else:
            if context.scene.cm.category == "EXPORT":
                if context.scene.cm.cachedir_path.exists():
                    sub_split = split.split(factor=1 - split_factor_small)
                    sub_split.label(icon="ERROR")
                    sub_split.prop(context.scene.cm, "cachedir", text="")

                else:
                    split.prop(context.scene.cm, "cachedir", text="")

            else:
                if not context.scene.cm.cachedir_path.exists():
                    split.label(text="Not found")
                else:
                    split.prop(context.scene.cm, "cachedir", text="")

        # Cacheconfig.
        split = box.split(factor=split_factor, align=True)
        # Cacheconfig label.
        split.label(text="Cacheconfig:")

        if not context.scene.cm.is_cacheconfig_valid:
            if (
                context.scene.cm.use_cacheconfig_custom
                and context.scene.cm.category == "IMPORT"
            ):
                sub_split = split.split(factor=0.95, align=True)
                sub_split.prop(context.scene.cm, "cacheconfig_custom", text="")
                sub_split.operator(
                    CM_OT_import_colls_from_config.bl_idname, icon="PLAY", text=""
                )
            else:
                split.label(text="Invalid. Check Addon Preferences.")

            row = box.row(align=True)
            row.prop(context.scene.cm, "use_cacheconfig_custom")

        else:
            if context.scene.cm.category == "EXPORT":

                if context.scene.cm.cacheconfig_path.exists():
                    sub_split = split.split(factor=1 - split_factor_small)
                    sub_split.label(icon="ERROR")
                    sub_split.prop(context.scene.cm, "cacheconfig", text="")

                else:
                    split.prop(context.scene.cm, "cacheconfig", text="")
            else:
                if context.scene.cm.use_cacheconfig_custom:
                    sub_split = split.split(factor=0.95, align=True)
                    sub_split.prop(context.scene.cm, "cacheconfig_custom", text="")
                    sub_split.operator(
                        CM_OT_import_colls_from_config.bl_idname, icon="PLAY", text=""
                    )

                else:
                    if not context.scene.cm.cacheconfig_path.exists():
                        split.label(text="Not found")

                    else:
                        sub_split = split.split(factor=0.95, align=True)
                        sub_split.prop(context.scene.cm, "cacheconfig", text="")
                        sub_split.operator(
                            CM_OT_import_colls_from_config.bl_idname,
                            icon="PLAY",
                            text="",
                        )
                row = box.row(align=True)
                row.prop(context.scene.cm, "use_cacheconfig_custom")

        # Add some space.
        row = layout.row(align=True)
        row.separator()

        # Collection operations.
        box = layout.box()
        box.label(text="Cache Collections", icon="OUTLINER_COLLECTION")
        if context.scene.cm.category == "EXPORT":

            # Get collections.
            collections = list(props.get_cache_collections_export(context))

            # Ui-list.
            row = box.row()
            row.template_list(
                "CM_UL_collection_cache_list_export",
                "collection_cache_list_export",
                context.scene.cm,
                "colls_export",
                context.scene.cm,
                "colls_export_index",
                rows=5,
                type="DEFAULT",
            )
            col = row.column(align=True)
            col.operator(
                CM_OT_update_cache_colls_list.bl_idname, icon="FILE_REFRESH", text=""
            )
            col.operator(
                CM_OT_cache_list_actions.bl_idname, icon="ADD", text=""
            ).action = "ADD"
            col.operator(
                CM_OT_cache_list_actions.bl_idname, icon="REMOVE", text=""
            ).action = "REMOVE"

            row = box.row(align=True)
            row.operator(
                CM_OT_cache_export.bl_idname,
                text=f"Cache {len(collections)} Collections",
                icon="EXPORT",
            ).do_all = True

            row.operator(
                CM_OT_cacheconfig_export.bl_idname,
                text="",
                icon="ALIGN_LEFT",
            ).do_all = True

        else:
            # Get collections.
            collections = list(props.get_cache_collections_import(context))

            # Ui-list.
            row = box.row()
            row.template_list(
                "CM_UL_collection_cache_list_import",
                "collection_cache_list_import",
                context.scene.cm,
                "colls_import",
                context.scene.cm,
                "colls_import_index",
                rows=5,
                type="DEFAULT",
            )
            col = row.column(align=True)
            col.operator(
                CM_OT_update_cache_colls_list.bl_idname, icon="FILE_REFRESH", text=""
            )
            col.operator(
                CM_OT_cache_list_actions.bl_idname, icon="ADD", text=""
            ).action = "ADD"
            col.operator(
                CM_OT_cache_list_actions.bl_idname, icon="REMOVE", text=""
            ).action = "REMOVE"

            row = box.row(align=True)
            row.operator(
                CM_OT_import_cache.bl_idname,
                text="Load",
                icon="IMPORT",
            ).do_all = True
            row.operator(
                CM_OT_cache_show.bl_idname, text="Show", icon="HIDE_OFF"
            ).do_all = True

            row.operator(
                CM_OT_cache_hide.bl_idname, text="Hide", icon="HIDE_ON"
            ).do_all = True

            row.operator(
                CM_OT_cache_remove.bl_idname, text="Remove", icon="REMOVE"
            ).do_all = True

    def _get_version_text(self, context: bpy.types.Context) -> str:
        version_text = "Select Version"

        if context.scene.cm.cache_version:
            version_text = context.scene.cm.cache_version

        return version_text

class CM_PT_vi3d_advanced(bpy.types.Panel):
    bl_parent_id = "CM_PT_vi3d_cache"
    bl_category = "CacheManager"
    bl_label = "Advanced"
    bl_space_type = "VIEW_3D"
    bl_region_type = "UI"
    bl_order = 10
    bl_options = {"DEFAULT_CLOSED"}

    def draw(self, context: bpy.types.Context) -> None:
        layout = self.layout

        # Alembic export settings.
        box = layout.box()
        box.label(text="Alembic Export Settings", icon="MODIFIER")

        # Frame range.
        col = box.column(align=True)
        col.prop(context.scene.cm, "frame_handles_left")
        col.prop(context.scene.cm, "frame_handles_right")

        # Shutter.
        col = box.column(align=True)
        col.prop(context.scene.cm, "sh_open")
        col.prop(context.scene.cm, "sh_close")

        # Samples.
        col = box.column(align=True)
        col.prop(context.scene.cm, "xsamples")
        col.prop(context.scene.cm, "gsamples")


class CM_UL_collection_cache_list_export(bpy.types.UIList):
    def draw_item(
        self, context, layout, data, item, icon, active_data, active_propname, index
    ):
        coll = item.coll_ptr

        if self.layout_type in {"DEFAULT", "COMPACT"}:
            # Item got deleted.
            if not coll:
                layout.label(text=f"{item.name} was deleted")
                return

            split = layout.split(factor=0.5, align=True)
            split.prop(
                coll,
                "name",
                text="",
                emboss=False,
                icon="OUTLINER_COLLECTION",
            )
            split = split.split(factor=0.75, align=True)
            split.label(text=f"/{propsdata.gen_cache_coll_filename(coll)}")
            split.operator(
                CM_OT_cache_export.bl_idname,
                text="",
                icon="EXPORT",
            ).index = index
            # Disable row if coll not valid.
            if not cache.is_valid_cache_coll(coll):
                split.enabled = False

        elif self.layout_type in {"GRID"}:
            layout.alignment = "CENTER"
            layout.label(text="", icon_value=layout.icon(coll))


class CM_UL_collection_cache_list_import(bpy.types.UIList):
    def draw_item(
        self, context, layout, data, item, icon, active_data, active_propname, index
    ):
        coll = item.coll_ptr

        if self.layout_type in {"DEFAULT", "COMPACT"}:
            # Item got deleted.
            if not coll:
                layout.label(text=f"{item.name} was deleted")
                return

            split = layout.split(factor=0.4, align=True)
            split.prop(
                coll,
                "name",
                text="",
                emboss=False,
                icon="OUTLINER_COLLECTION",
            )
            split = split.split(factor=0.7, align=True)

            cachefile = coll.cm.cachefile
            op_text = "Select Cachefile"
            if cachefile:
                op_text = Path(cachefile).name

            split.operator(
                CM_OT_assign_cachefile.bl_idname, text=op_text, icon="DOWNARROW_HLT"
            ).index = index

            if not coll.cm.is_cache_loaded:
                split.operator(
                    CM_OT_import_cache.bl_idname,
                    text="",
                    icon="IMPORT",
                ).index = index
            else:
                split.operator(
                    CM_OT_cache_remove.bl_idname, text="", icon="REMOVE"
                ).index = index

            if coll.cm.is_cache_hidden:
                split.operator(
                    CM_OT_cache_show.bl_idname, text="", icon="HIDE_ON"
                ).index = index
            else:
                split.operator(
                    CM_OT_cache_hide.bl_idname, text="", icon="HIDE_OFF"
                ).index = index

        elif self.layout_type in {"GRID"}:
            layout.alignment = "CENTER"
            layout.label(text="", icon_value=layout.icon(item.coll_ptr))


# ---------REGISTER ----------.

classes = [
    CM_UL_collection_cache_list_export,
    CM_UL_collection_cache_list_import,
    CM_PT_vi3d_cache,
    CM_PT_vi3d_advanced,
]


def register():
    for cls in classes:
        bpy.utils.register_class(cls)


def unregister():
    for cls in reversed(classes):
        bpy.utils.unregister_class(cls)