How to center-crop and fit-width/height a video in a live wallpaper?

Date: 2018-04-29 22:24:39

Tags: android android-mediaplayer live-wallpaper

Background

I'm making a live wallpaper that can show a video. At first I thought this would be very hard, so some people suggested using OpenGL-based solutions or other very complex ones (such as this one).

Anyway, I found various places talking about it, and based on this github library (which has some bugs), I finally got it to work.

The problem

While I did succeed in playing a video, I can't find a way to control how it's displayed compared to the screen resolution.

Currently it always gets stretched to the screen size, meaning that this (video taken from here):

(image: the original video frame)

is displayed like this:

(image: the same video stretched to fill the screen)

The reason is the different aspect ratios: 560x320 (video resolution) vs. 1080x1920 (device resolution).

Note: I'm well aware of solutions for scaling a video that are available in various GitHub repositories (such as here), but I'm asking about a live wallpaper. As such, there is no View to work with, so it's more limited in how things can be done. More specifically, the solution can't involve any kind of layout, TextureView, SurfaceView, or any other kind of View.

What I've tried

I tried playing with the various fields and functions of SurfaceHolder, but with no luck so far.

Here's the code I've currently written (the full project is available here):

class MovieLiveWallpaperService : WallpaperService() {
    override fun onCreateEngine(): WallpaperService.Engine {
        return VideoLiveWallpaperEngine()
    }

    private enum class PlayerState {
        NONE, PREPARING, READY, PLAYING
    }

    inner class VideoLiveWallpaperEngine : WallpaperService.Engine() {
        private var mp: MediaPlayer? = null
        private var playerState: PlayerState = PlayerState.NONE

        override fun onSurfaceCreated(holder: SurfaceHolder) {
            super.onSurfaceCreated(holder)
            Log.d("AppLog", "onSurfaceCreated")
            mp = MediaPlayer()
            val mySurfaceHolder = MySurfaceHolder(holder)
            mp!!.setDisplay(mySurfaceHolder)
            mp!!.isLooping = true
            mp!!.setVolume(0.0f, 0.0f)
            mp!!.setOnPreparedListener { mp ->
                playerState = PlayerState.READY
                setPlay(true)
            }
            try {
                //mp!!.setDataSource(this@MovieLiveWallpaperService, Uri.parse("http://techslides.com/demos/sample-videos/small.mp4"))
                mp!!.setDataSource(this@MovieLiveWallpaperService, Uri.parse("android.resource://" + packageName + "/" + R.raw.small))
            } catch (e: Exception) {
                Log.e("AppLog", "failed to set the data source", e)
            }
        }

        override fun onDestroy() {
            super.onDestroy()
            Log.d("AppLog", "onDestroy")
            if (mp == null)
                return
            mp!!.stop()
            mp!!.release()
            playerState = PlayerState.NONE
        }

        private fun setPlay(play: Boolean) {
            if (mp == null)
                return
            if (play == mp!!.isPlaying)
                return
            when {
                !play -> {
                    mp!!.pause()
                    playerState = PlayerState.READY
                }
                mp!!.isPlaying -> return
                playerState == PlayerState.READY -> {
                    Log.d("AppLog", "ready, so starting to play")
                    mp!!.start()
                    playerState = PlayerState.PLAYING
                }
                playerState == PlayerState.NONE -> {
                    Log.d("AppLog", "not ready, so preparing")
                    mp!!.prepareAsync()
                    playerState = PlayerState.PREPARING
                }
            }
        }

        override fun onVisibilityChanged(visible: Boolean) {
            super.onVisibilityChanged(visible)
            Log.d("AppLog", "onVisibilityChanged:" + visible + " " + playerState)
            if (mp == null)
                return
            setPlay(visible)
        }

    }

    class MySurfaceHolder(private val surfaceHolder: SurfaceHolder) : SurfaceHolder {
        override fun addCallback(callback: SurfaceHolder.Callback) = surfaceHolder.addCallback(callback)

        override fun getSurface() = surfaceHolder.surface!!

        override fun getSurfaceFrame() = surfaceHolder.surfaceFrame

        override fun isCreating(): Boolean = surfaceHolder.isCreating

        override fun lockCanvas(): Canvas = surfaceHolder.lockCanvas()

        override fun lockCanvas(dirty: Rect): Canvas = surfaceHolder.lockCanvas(dirty)

        override fun removeCallback(callback: SurfaceHolder.Callback) = surfaceHolder.removeCallback(callback)

        override fun setFixedSize(width: Int, height: Int) = surfaceHolder.setFixedSize(width, height)

        override fun setFormat(format: Int) = surfaceHolder.setFormat(format)

        override fun setKeepScreenOn(screenOn: Boolean) {}

        override fun setSizeFromLayout() = surfaceHolder.setSizeFromLayout()

        override fun setType(type: Int) = surfaceHolder.setType(type)

        override fun unlockCanvasAndPost(canvas: Canvas) = surfaceHolder.unlockCanvasAndPost(canvas)
    }
}

The question

I'd like to know how to scale the content the way we can for ImageView, all while keeping the aspect ratio (see the sketch right after this list for the scale math I mean):

  1. center-crop - fills 100% of the container (the screen in this case), cropping on the sides (top and bottom, or left and right) when needed. Nothing gets stretched. This means the content looks fine, but not all of it may be shown.
  2. fit-center - stretched to fit width/height.
  3. center-inside - set to the original size, centered, and scaled down to fit width/height only when it's too large.
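
To make it concrete, this is roughly the scale math I mean for each mode (just a sketch, not tied to any particular API; videoW/videoH and screenW/screenH are assumed to be known):

// Sketch only: the uniform scale factor each mode should end up with,
// relative to the video's original pixel size.
fun desiredScale(mode: String, videoW: Float, videoH: Float,
                 screenW: Float, screenH: Float): Float {
    val fit = minOf(screenW / videoW, screenH / videoH)   // whole video visible
    val crop = maxOf(screenW / videoW, screenH / videoH)  // whole screen covered
    return when (mode) {
        "center-crop" -> crop                 // fills the screen, excess gets cropped
        "fit-center" -> fit                   // letter/pillar-boxed, nothing cropped
        "center-inside" -> minOf(fit, 1f)     // like fit-center, but never scaled up past 1:1
        else -> 1f
    }
}

For the 560x320 video on a 1080x1920 screen, for example, center-crop gives 1920/320 = 6 while fit-center gives 1080/560 ≈ 1.93.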

5 Answers:

Answer 0 (score: 5)

You can achieve this using a TextureView (a SurfaceView won't work for this). I found some code that will help you achieve it.
In this demo you can crop the video in three modes: center, top and bottom.

TextureVideoView.java

public class TextureVideoView extends TextureView implements TextureView.SurfaceTextureListener {

    // Indicate if logging is on
    public static final boolean LOG_ON = true;

    // Log tag
    private static final String TAG = TextureVideoView.class.getName();

    private MediaPlayer mMediaPlayer;

    private float mVideoHeight;
    private float mVideoWidth;

    private boolean mIsDataSourceSet;
    private boolean mIsViewAvailable;
    private boolean mIsVideoPrepared;
    private boolean mIsPlayCalled;

    private ScaleType mScaleType;
    private State mState;

    public enum ScaleType {
        CENTER_CROP, TOP, BOTTOM
    }

    public enum State {
        UNINITIALIZED, PLAY, STOP, PAUSE, END
    }

    public TextureVideoView(Context context) {
        super(context);
        initView();
    }

    public TextureVideoView(Context context, AttributeSet attrs) {
        super(context, attrs);
        initView();
    }

    public TextureVideoView(Context context, AttributeSet attrs, int defStyle) {
        super(context, attrs, defStyle);
        initView();
    }

    private void initView() {
        initPlayer();
        setScaleType(ScaleType.CENTER_CROP);
        setSurfaceTextureListener(this);
    }

    public void setScaleType(ScaleType scaleType) {
        mScaleType = scaleType;
    }

    private void updateTextureViewSize() {
        float viewWidth = getWidth();
        float viewHeight = getHeight();

        float scaleX = 1.0f;
        float scaleY = 1.0f;

        if (mVideoWidth > viewWidth && mVideoHeight > viewHeight) {
            scaleX = mVideoWidth / viewWidth;
            scaleY = mVideoHeight / viewHeight;
        } else if (mVideoWidth < viewWidth && mVideoHeight < viewHeight) {
            scaleY = viewWidth / mVideoWidth;
            scaleX = viewHeight / mVideoHeight;
        } else if (viewWidth > mVideoWidth) {
            scaleY = (viewWidth / mVideoWidth) / (viewHeight / mVideoHeight);
        } else if (viewHeight > mVideoHeight) {
            scaleX = (viewHeight / mVideoHeight) / (viewWidth / mVideoWidth);
        }

        // Calculate pivot points, in our case crop from center
        int pivotPointX;
        int pivotPointY;

        switch (mScaleType) {
            case TOP:
                pivotPointX = 0;
                pivotPointY = 0;
                break;
            case BOTTOM:
                pivotPointX = (int) (viewWidth);
                pivotPointY = (int) (viewHeight);
                break;
            case CENTER_CROP:
                pivotPointX = (int) (viewWidth / 2);
                pivotPointY = (int) (viewHeight / 2);
                break;
            default:
                pivotPointX = (int) (viewWidth / 2);
                pivotPointY = (int) (viewHeight / 2);
                break;
        }

        Matrix matrix = new Matrix();
        matrix.setScale(scaleX, scaleY, pivotPointX, pivotPointY);

        setTransform(matrix);
    }

    private void initPlayer() {
        if (mMediaPlayer == null) {
            mMediaPlayer = new MediaPlayer();
        } else {
            mMediaPlayer.reset();
        }
        mIsVideoPrepared = false;
        mIsPlayCalled = false;
        mState = State.UNINITIALIZED;
    }

    /**
     * @see MediaPlayer#setDataSource(String)
     */
    public void setDataSource(String path) {
        initPlayer();

        try {
            mMediaPlayer.setDataSource(path);
            mIsDataSourceSet = true;
            prepare();
        } catch (IOException e) {
            Log.d(TAG, e.getMessage());
        }
    }

    /**
     * @see MediaPlayer#setDataSource(Context, Uri)
     */
    public void setDataSource(Context context, Uri uri) {
        initPlayer();

        try {
            mMediaPlayer.setDataSource(context, uri);
            mIsDataSourceSet = true;
            prepare();
        } catch (IOException e) {
            Log.d(TAG, e.getMessage());
        }
    }

    /**
     * @see MediaPlayer#setDataSource(java.io.FileDescriptor)
     */
    public void setDataSource(AssetFileDescriptor afd) {
        initPlayer();

        try {
            long startOffset = afd.getStartOffset();
            long length = afd.getLength();
            mMediaPlayer.setDataSource(afd.getFileDescriptor(), startOffset, length);
            mIsDataSourceSet = true;
            prepare();
        } catch (IOException e) {
            Log.d(TAG, e.getMessage());
        }
    }

    private void prepare() {
        try {
            mMediaPlayer.setOnVideoSizeChangedListener(
                    new MediaPlayer.OnVideoSizeChangedListener() {
                        @Override
                        public void onVideoSizeChanged(MediaPlayer mp, int width, int height) {
                            mVideoWidth = width;
                            mVideoHeight = height;
                            updateTextureViewSize();
                        }
                    }
            );
            mMediaPlayer.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
                @Override
                public void onCompletion(MediaPlayer mp) {
                    mState = State.END;
                    log("Video has ended.");

                    if (mListener != null) {
                        mListener.onVideoEnd();
                    }
                }
            });

            // don't forget to call MediaPlayer.prepareAsync() method when you use constructor for
            // creating MediaPlayer
            mMediaPlayer.prepareAsync();

            // Play video when the media source is ready for playback.
            mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                @Override
                public void onPrepared(MediaPlayer mediaPlayer) {
                    mIsVideoPrepared = true;
                    if (mIsPlayCalled && mIsViewAvailable) {
                        log("Player is prepared and play() was called.");
                        play();
                    }

                    if (mListener != null) {
                        mListener.onVideoPrepared();
                    }
                }
            });

        } catch (IllegalArgumentException e) {
            Log.d(TAG, e.getMessage());
        } catch (SecurityException e) {
            Log.d(TAG, e.getMessage());
        } catch (IllegalStateException e) {
            Log.d(TAG, e.toString());
        }
    }

    /**
     * Play or resume video. Video will be played as soon as view is available and media player is
     * prepared.
     *
     * If video is stopped or ended and play() method was called, video will start over.
     */
    public void play() {
        if (!mIsDataSourceSet) {
            log("play() was called but data source was not set.");
            return;
        }

        mIsPlayCalled = true;

        if (!mIsVideoPrepared) {
            log("play() was called but video is not prepared yet, waiting.");
            return;
        }

        if (!mIsViewAvailable) {
            log("play() was called but view is not available yet, waiting.");
            return;
        }

        if (mState == State.PLAY) {
            log("play() was called but video is already playing.");
            return;
        }

        if (mState == State.PAUSE) {
            log("play() was called but video is paused, resuming.");
            mState = State.PLAY;
            mMediaPlayer.start();
            return;
        }

        if (mState == State.END || mState == State.STOP) {
            log("play() was called but video already ended, starting over.");
            mState = State.PLAY;
            mMediaPlayer.seekTo(0);
            mMediaPlayer.start();
            return;
        }

        mState = State.PLAY;
        mMediaPlayer.start();
    }

    /**
     * Pause video. If video is already paused, stopped or ended nothing will happen.
     */
    public void pause() {
        if (mState == State.PAUSE) {
            log("pause() was called but video already paused.");
            return;
        }

        if (mState == State.STOP) {
            log("pause() was called but video already stopped.");
            return;
        }

        if (mState == State.END) {
            log("pause() was called but video already ended.");
            return;
        }

        mState = State.PAUSE;
        if (mMediaPlayer.isPlaying()) {
            mMediaPlayer.pause();
        }
    }

    /**
     * Stop video (pause and seek to beginning). If video is already stopped or ended nothing will
     * happen.
     */
    public void stop() {
        if (mState == State.STOP) {
            log("stop() was called but video already stopped.");
            return;
        }

        if (mState == State.END) {
            log("stop() was called but video already ended.");
            return;
        }

        mState = State.STOP;
        if (mMediaPlayer.isPlaying()) {
            mMediaPlayer.pause();
            mMediaPlayer.seekTo(0);
        }
    }

    /**
     * @see MediaPlayer#setLooping(boolean)
     */
    public void setLooping(boolean looping) {
        mMediaPlayer.setLooping(looping);
    }

    /**
     * @see MediaPlayer#seekTo(int)
     */
    public void seekTo(int milliseconds) {
        mMediaPlayer.seekTo(milliseconds);
    }

    /**
     * @see MediaPlayer#getDuration()
     */
    public int getDuration() {
        return mMediaPlayer.getDuration();
    }

    static void log(String message) {
        if (LOG_ON) {
            Log.d(TAG, message);
        }
    }

    private MediaPlayerListener mListener;

    /**
     * Listener trigger 'onVideoPrepared' and `onVideoEnd` events
     */
    public void setListener(MediaPlayerListener listener) {
        mListener = listener;
    }

    public interface MediaPlayerListener {

        public void onVideoPrepared();

        public void onVideoEnd();
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
        Surface surface = new Surface(surfaceTexture);
        mMediaPlayer.setSurface(surface);
        mIsViewAvailable = true;
        if (mIsDataSourceSet && mIsPlayCalled && mIsVideoPrepared) {
            log("View is available and play() was called.");
            play();
        }
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {

    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        return false;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {

    }
}

Then use that class as in the following code in MainActivity.java:

public class MainActivity extends AppCompatActivity implements View.OnClickListener,
        ActionBar.OnNavigationListener {

    // Video file url
    private static final String FILE_URL = "http://techslides.com/demos/sample-videos/small.mp4";
    private TextureVideoView mTextureVideoView;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        initView();
        initActionBar();

        if (!isWIFIOn(getBaseContext())) {
            Toast.makeText(getBaseContext(), "You need internet connection to stream video",
                    Toast.LENGTH_LONG).show();
        }
    }

    private void initActionBar() {
        ActionBar actionBar = getSupportActionBar();
        actionBar.setNavigationMode(ActionBar.NAVIGATION_MODE_LIST);
        actionBar.setDisplayShowTitleEnabled(false);

        SpinnerAdapter mSpinnerAdapter = ArrayAdapter.createFromResource(this, R.array.action_list,
                android.R.layout.simple_spinner_dropdown_item);
        actionBar.setListNavigationCallbacks(mSpinnerAdapter, this);
    }

    private void initView() {
        mTextureVideoView = (TextureVideoView) findViewById(R.id.cropTextureView);

        findViewById(R.id.btnPlay).setOnClickListener(this);
        findViewById(R.id.btnPause).setOnClickListener(this);
        findViewById(R.id.btnStop).setOnClickListener(this);
    }

    @Override
    public void onClick(View v) {
        switch (v.getId()) {
            case R.id.btnPlay:
                mTextureVideoView.play();
                break;
            case R.id.btnPause:
                mTextureVideoView.pause();
                break;
            case R.id.btnStop:
                mTextureVideoView.stop();
                break;
        }
    }

    final int indexCropCenter = 0;
    final int indexCropTop = 1;
    final int indexCropBottom = 2;

    @Override
    public boolean onNavigationItemSelected(int itemPosition, long itemId) {
        switch (itemPosition) {
            case indexCropCenter:
                mTextureVideoView.stop();
                mTextureVideoView.setScaleType(TextureVideoView.ScaleType.CENTER_CROP);
                mTextureVideoView.setDataSource(FILE_URL);
                mTextureVideoView.play();
                break;
            case indexCropTop:
                mTextureVideoView.stop();
                mTextureVideoView.setScaleType(TextureVideoView.ScaleType.TOP);
                mTextureVideoView.setDataSource(FILE_URL);
                mTextureVideoView.play();
                break;
            case indexCropBottom:
                mTextureVideoView.stop();
                mTextureVideoView.setScaleType(TextureVideoView.ScaleType.BOTTOM);
                mTextureVideoView.setDataSource(FILE_URL);
                mTextureVideoView.play();
                break;
        }
        return true;
    }

    public static boolean isWIFIOn(Context context) {
        ConnectivityManager connMgr =
                (ConnectivityManager) context.getSystemService(Context.CONNECTIVITY_SERVICE);
        NetworkInfo networkInfo = connMgr.getNetworkInfo(ConnectivityManager.TYPE_WIFI);

        return (networkInfo != null && networkInfo.isConnected());
    }
}

And the activity_main.xml layout file is as follows:

<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent">

    <com.example.videocropdemo.crop.TextureVideoView
        android:id="@+id/cropTextureView"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        android:layout_centerInParent="true" />

    <LinearLayout
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:layout_alignParentBottom="true"
        android:layout_margin="16dp"
        android:orientation="horizontal">

        <Button
            android:id="@+id/btnPlay"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="Play" />

        <Button
            android:id="@+id/btnPause"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="Pause" />

        <Button
            android:id="@+id/btnStop"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="Stop" />
    </LinearLayout>
</RelativeLayout>

The output of the center-crop mode looks something like this:

(image: the video center-cropped to fill the screen)

Answer 1 (score: 4)

So I haven't been able to get every scale type you asked for yet, but I've managed to get fit-xy and center-crop working fairly easily using ExoPlayer. The full code can be seen at https://github.com/yperess/StackOverflow/tree/50091878 and I'll keep updating it as I go. Eventually I'll also fill in the MainActivity to let you choose the scale type as a setting (I'll do it with a simple PreferenceActivity) and read the shared preference value on the service side.

The general idea is that deep down MediaCodec already implements both fit-xy and center-crop, which are really the only two modes you'd need if you had access to a view hierarchy. That's because fit-center, fit-top, and fit-bottom would all really just be fit-xy, where the surface has a gravity and is scaled to match the video size * minimal scale. To get those working, what I believe needs to happen is to create an OpenGL context and provide a SurfaceTexture. That SurfaceTexture can be wrapped with a stub Surface which can be passed to ExoPlayer. Once the video is loaded we can set their sizes, since we created them. We also have a callback on the SurfaceTexture that tells us when a frame is ready. At that point we should be able to modify the frame (hopefully with just a simple matrix scale and transform).
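
For what it's worth, the SurfaceTexture wrapping described above would look roughly like this (a minimal sketch, assuming an EGL context is already current and textureId is a GL_TEXTURE_EXTERNAL_OES texture created on it; videoWidth, videoHeight and player are placeholders, not names from the linked repo):

// Wrap a GL texture in a SurfaceTexture, then in a stub Surface handed to ExoPlayer.
val surfaceTexture = SurfaceTexture(textureId).apply {
    setDefaultBufferSize(videoWidth, videoHeight)
    setOnFrameAvailableListener {
        // request a render pass; call updateTexImage() on the GL thread before drawing
    }
}
val stubSurface = Surface(surfaceTexture)
player.setVideoSurface(stubSurface)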

The key components here are creating the ExoPlayer:

    private fun initExoMediaPlayer(): SimpleExoPlayer {
        val videoTrackSelectionFactory = AdaptiveTrackSelection.Factory(bandwidthMeter)
        val trackSelector = DefaultTrackSelector(videoTrackSelectionFactory)
        val player = ExoPlayerFactory.newSimpleInstance(this@MovieLiveWallpaperService,
                trackSelector)
        player.playWhenReady = true
        player.repeatMode = Player.REPEAT_MODE_ONE
        player.volume = 0f
        if (mode == Mode.CENTER_CROP) {
            player.videoScalingMode = C.VIDEO_SCALING_MODE_SCALE_TO_FIT_WITH_CROPPING
        } else {
            player.videoScalingMode = C.VIDEO_SCALING_MODE_SCALE_TO_FIT
        }
        if (mode == Mode.FIT_CENTER) {
            player.addVideoListener(this)
        }
        return player
    }

Then loading the video:

    override fun onSurfaceCreated(holder: SurfaceHolder) {
        super.onSurfaceCreated(holder)
        if (mode == Mode.FIT_CENTER) {
            // We need to somehow wrap the surface or set some scale factor on exo player here.
            // Most likely this will require creating a SurfaceTexture and attaching it to an
            // OpenGL context. Then for each frame, writing it to the original surface but with
            // an offset
            exoMediaPlayer.setVideoSurface(holder.surface)
        } else {
            exoMediaPlayer.setVideoSurfaceHolder(holder)
        }

        val videoUri = RawResourceDataSource.buildRawResourceUri(R.raw.small)
        val dataSourceFactory = DataSource.Factory { RawResourceDataSource(context) }
        val mediaSourceFactory = ExtractorMediaSource.Factory(dataSourceFactory)
        exoMediaPlayer.prepare(mediaSourceFactory.createMediaSource(videoUri))
    }

UPDATE:

I got it working. I'll need to clean the code up tomorrow before posting it, but here's a sneak preview... fit_center

What I ended up doing is basically taking GLSurfaceView and ripping it apart. If you look at its source, the only thing that makes it impossible to use in a wallpaper is that it only starts the GLThread when attached to a window. So if you copy the same code but allow starting the GLThread manually, you can go right ahead. After that, it's just a matter of keeping track of the screen vs. video size, scaling to the minimal scale that fits, and translating the quad you're drawing.
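
The fit-center math that paragraph describes boils down to something like this (a sketch with illustrative names, not the repo's actual code):

// Minimal scale that fits, and the fraction of the screen the quad should cover on each axis.
fun fitCenterQuad(videoW: Float, videoH: Float,
                  screenW: Float, screenH: Float): Pair<Float, Float> {
    val scale = minOf(screenW / videoW, screenH / videoH)
    return (videoW * scale / screenW) to (videoH * scale / screenH)
}

// e.g. a 560x320 video on a 1080x1920 screen: scale = 1080/560 ≈ 1.93, so the quad
// covers 100% of the width and about 32% of the height (letterboxed top and bottom).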

Known issues with the code:

  1. There's a small bug in the GLThread that I haven't been able to pin down. It looks like a simple timing issue where, when the thread pauses, I get a call to signallAll() that isn't actually waiting for anything.
  2. I didn't bother modifying the mode in the renderer dynamically. It shouldn't be hard: add a preference listener when the Engine is created, then update the renderer when scale_type changes.

UPDATE: All the issues have been solved. signallAll() was being thrown because I had missed a check to see whether we actually own the lock. I also added a listener to dynamically update the scale type, so now all scale types use the GlEngine.
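
The "listener to dynamically update the scale type" is roughly this pattern (a sketch; the preference key and the renderer method are placeholders, not the repo's actual names):

// In the Engine: react to scale-type changes made from the settings screen.
val prefs = PreferenceManager.getDefaultSharedPreferences(this@MovieLiveWallpaperService)
val prefListener = SharedPreferences.OnSharedPreferenceChangeListener { p, key ->
    if (key == "scale_type") {
        renderer.setScaleType(p.getString("scale_type", "fit_center"))
        glSurfaceView.requestRender()
    }
}
prefs.registerOnSharedPreferenceChangeListener(prefListener)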

Enjoy!

Answer 2 (score: 1)

I found this post: How to set video as live wallpaper and keep video aspect ratio (width and height)

The source in the post above is simple (it just handles a "set wallpaper" button click). If you want a full-featured app, see https://github.com/AlynxZhou/alynx-live-wallpaper

The key is to use a GLSurfaceView instead of the WallpaperService's default SurfaceView and to create a custom GLSurfaceView renderer. A GLSurfaceView can draw with OpenGL, so the question becomes "how to play video with a GLSurfaceView" or "how to play video with OpenGL".

Here's how to use a GLSurfaceView in place of the WallpaperService's default SurfaceView:

public class GLWallpaperService extends WallpaperService {
...

    class GLWallpaperEngine extends Engine {
...

        private class GLWallpaperSurfaceView extends GLSurfaceView {
            @SuppressWarnings("unused")
            private static final String TAG = "GLWallpaperSurface";

            public GLWallpaperSurfaceView(Context context) {
                super(context);
            }

            /**
             * This is a hack. Because Android Live Wallpaper only has a Surface.
             * So we create a GLSurfaceView, and when drawing to its Surface,
             * we replace it with WallpaperEngine's Surface.
             */
            @Override
            public SurfaceHolder getHolder() {
                return getSurfaceHolder();
            }

            void onDestroy() {
                super.onDetachedFromWindow();
            }
        }
        ...
    }
}
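
For context, wiring that view into the Engine goes roughly like this (a sketch in Kotlin with illustrative names; the linked alynx-live-wallpaper project contains the real, more complete Java implementation):

// Create the hacked GLSurfaceView, give it a renderer that draws the video,
// and forward the engine's lifecycle to it manually.
override fun onCreate(surfaceHolder: SurfaceHolder) {
    super.onCreate(surfaceHolder)
    glSurfaceView = GLWallpaperSurfaceView(applicationContext).apply {
        setEGLContextClientVersion(2)
        setRenderer(videoRenderer)      // a GLSurfaceView.Renderer that plays the video
        renderMode = GLSurfaceView.RENDERMODE_WHEN_DIRTY
    }
}

override fun onVisibilityChanged(visible: Boolean) {
    // A GLSurfaceView normally follows an Activity's lifecycle; a wallpaper engine
    // has to forward its own visibility instead.
    if (visible) glSurfaceView.onResume() else glSurfaceView.onPause()
}

override fun onDestroy() {
    glSurfaceView.onDestroy()           // the helper defined in the class above
    super.onDestroy()
}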

Answer 3 (score: 0)

My solution is to use a GIF (with the same size and fps as the video) in the live wallpaper instead of a video.

See my answer: https://stackoverflow.com/a/60425717/6011193 - a WallpaperService works well with a GIF.

Use ffmpeg on a computer to convert the video to a GIF (see the example command below),

or, on Android, the video can be converted to a GIF in code: see https://stackoverflow.com/a/16749143/6011193
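
A typical ffmpeg invocation for the conversion would be something like the following (my own example, not from the linked answers; adjust the fps and size to match your video): ffmpeg -i video.mp4 -vf "fps=15,scale=540:-1" wallpaper.gif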

Answer 4 (score: -7)

You can use Glide for GIF and image loading, and it provides the scaling options you need, based on the docs at https://bumptech.github.io/glide/doc/targets.html#sizes-and-dimensions and https://futurestud.io/tutorials/glide-image-resizing-scaling .

Glide v4 requires Android Ice Cream Sandwich (API level 14) or higher.

Like this:

public static void loadCircularImageGlide(String imagePath, ImageView view) {
    Glide.with(view.getContext())
            .load(imagePath)
            .asGif()
            .override(600, 200) // resizes the image to these dimensions (in pixel). resize does not respect aspect ratio
            .error(R.drawable.create_timeline_placeholder)
            .fitCenter() // scaling options
            .transform(new CircularTransformation(view.getContext())) // Even you can Give image tranformation too
            .into(view);
}